Dec 01 00:07:18 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 00:07:19 crc restorecon[4699]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 00:07:19 crc restorecon[4699]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 
00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 00:07:19
crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 
00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 
crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc 
restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 00:07:19 crc restorecon[4699]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 01 00:07:19 crc kubenswrapper[4911]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 00:07:19 crc kubenswrapper[4911]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 01 00:07:19 crc kubenswrapper[4911]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 00:07:19 crc kubenswrapper[4911]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 01 00:07:19 crc kubenswrapper[4911]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 01 00:07:19 crc kubenswrapper[4911]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.965795 4911 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969018 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969037 4911 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969041 4911 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969045 4911 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969049 4911 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969054 4911 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969059 4911 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969065 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969069 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969074 
4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969079 4911 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969083 4911 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969087 4911 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969091 4911 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969095 4911 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969100 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969105 4911 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969110 4911 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969115 4911 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969118 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969123 4911 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969128 4911 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969132 4911 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969136 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969140 4911 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969144 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969148 4911 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969152 4911 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969156 4911 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969161 4911 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969165 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969169 4911 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969173 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969176 4911 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969180 4911 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969185 4911 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969190 4911 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969195 4911 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969199 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969203 4911 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969207 4911 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969210 4911 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969214 4911 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969218 4911 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969221 4911 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969225 4911 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969228 4911 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969232 4911 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969235 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969238 4911 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969242 4911 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969246 4911 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969249 4911 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969253 4911 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969256 4911 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969260 4911 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969264 4911 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969267 4911 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969270 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969274 4911 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969277 4911 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969280 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969285 4911 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969289 4911 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969292 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969296 4911 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969300 4911 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969305 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969309 4911 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969314 4911 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.969318 4911 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969395 4911 flags.go:64] FLAG: --address="0.0.0.0"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969404 4911 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969414 4911 flags.go:64] FLAG: --anonymous-auth="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969421 4911 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969426 4911 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969431 4911 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969436 4911 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969441 4911 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969446 4911 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969451 4911 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969456 4911 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969476 4911 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969481 4911 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969486 4911 flags.go:64] FLAG: --cgroup-root=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969490 4911 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969494 4911 flags.go:64] FLAG: --client-ca-file=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969500 4911 flags.go:64] FLAG: --cloud-config=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969504 4911 flags.go:64] FLAG: --cloud-provider=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969509 4911 flags.go:64] FLAG: --cluster-dns="[]"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969515 4911 flags.go:64] FLAG: --cluster-domain=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969519 4911 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969523 4911 flags.go:64] FLAG: --config-dir=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969527 4911 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969531 4911 flags.go:64] FLAG: --container-log-max-files="5"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969537 4911 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969541 4911 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969545 4911 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969549 4911 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969554 4911 flags.go:64] FLAG: --contention-profiling="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969558 4911 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969562 4911 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969566 4911 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969570 4911 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969575 4911 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969579 4911 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969585 4911 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969589 4911 flags.go:64] FLAG: --enable-load-reader="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969593 4911 flags.go:64] FLAG: --enable-server="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969597 4911 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969603 4911 flags.go:64] FLAG: --event-burst="100"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969607 4911 flags.go:64] FLAG: --event-qps="50"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969611 4911 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969615 4911 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969619 4911 flags.go:64] FLAG: --eviction-hard=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969624 4911 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969628 4911 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969632 4911 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969636 4911 flags.go:64] FLAG: --eviction-soft=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969641 4911 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969644 4911 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969649 4911 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969653 4911 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969657 4911 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969661 4911 flags.go:64] FLAG: --fail-swap-on="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969665 4911 flags.go:64] FLAG: --feature-gates=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969670 4911 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969675 4911 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969679 4911 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969683 4911 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969687 4911 flags.go:64] FLAG: --healthz-port="10248"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969691 4911 flags.go:64] FLAG: --help="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969695 4911 flags.go:64] FLAG: --hostname-override=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969699 4911 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969703 4911 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969707 4911 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969711 4911 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969715 4911 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969719 4911 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969723 4911 flags.go:64] FLAG: --image-service-endpoint=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969728 4911 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969732 4911 flags.go:64] FLAG: --kube-api-burst="100"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969736 4911 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969741 4911 flags.go:64] FLAG: --kube-api-qps="50"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969744 4911 flags.go:64] FLAG: --kube-reserved=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969749 4911 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969753 4911 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969757 4911 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969761 4911 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969766 4911 flags.go:64] FLAG: --lock-file=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969771 4911 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969776 4911 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969781 4911 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969788 4911 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969792 4911 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969797 4911 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969802 4911 flags.go:64] FLAG: --logging-format="text"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969806 4911 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969810 4911 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969815 4911 flags.go:64] FLAG: --manifest-url=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969819 4911 flags.go:64] FLAG: --manifest-url-header=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969824 4911 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969828 4911 flags.go:64] FLAG: --max-open-files="1000000"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969833 4911 flags.go:64] FLAG: --max-pods="110"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969837 4911 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969841 4911 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969845 4911 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969849 4911 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969854 4911 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969859 4911 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969863 4911 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969872 4911 flags.go:64] FLAG: --node-status-max-images="50"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969877 4911 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969881 4911 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969885 4911 flags.go:64] FLAG: --pod-cidr=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969889 4911 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969896 4911 flags.go:64] FLAG: --pod-manifest-path=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969900 4911 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969905 4911 flags.go:64] FLAG: --pods-per-core="0"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969908 4911 flags.go:64] FLAG: --port="10250"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969913 4911 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969917 4911 flags.go:64] FLAG: --provider-id=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969921 4911 flags.go:64] FLAG: --qos-reserved=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969926 4911 flags.go:64] FLAG: --read-only-port="10255"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969930 4911 flags.go:64] FLAG: --register-node="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969934 4911 flags.go:64] FLAG: --register-schedulable="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969938 4911 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969945 4911 flags.go:64] FLAG: --registry-burst="10"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969949 4911 flags.go:64] FLAG: --registry-qps="5"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969953 4911 flags.go:64] FLAG: --reserved-cpus=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969957 4911 flags.go:64] FLAG: --reserved-memory=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969962 4911 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969966 4911 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969970 4911 flags.go:64] FLAG: --rotate-certificates="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969974 4911 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969979 4911 flags.go:64] FLAG: --runonce="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969983 4911 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969987 4911 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969991 4911 flags.go:64] FLAG: --seccomp-default="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.969995 4911 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970000 4911 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970004 4911 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970008 4911 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970012 4911 flags.go:64] FLAG: --storage-driver-password="root"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970017 4911 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970021 4911 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970025 4911 flags.go:64] FLAG: --storage-driver-user="root"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970029 4911 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970034 4911 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970038 4911 flags.go:64] FLAG: --system-cgroups=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970042 4911 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970048 4911 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970052 4911 flags.go:64] FLAG: --tls-cert-file=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970056 4911 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970061 4911 flags.go:64] FLAG: --tls-min-version=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970065 4911 flags.go:64] FLAG: --tls-private-key-file=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970069 4911 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970073 4911 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970077 4911 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970082 4911 flags.go:64] FLAG: --v="2"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970089 4911 flags.go:64] FLAG: --version="false"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970094 4911 flags.go:64] FLAG: --vmodule=""
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970099 4911 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970104 4911 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970208 4911 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970213 4911 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970217 4911 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970223 4911 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970228 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970235 4911 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970240 4911 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970245 4911 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970249 4911 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970254 4911 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970258 4911 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970263 4911 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970267 4911 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970272 4911 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970276 4911 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970280 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970284 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970288 4911 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970292 4911 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970296 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970300 4911 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970303 4911 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970307 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970310 4911 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970314 4911 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970317 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970320 4911 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970324 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970328 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970331 4911 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970335 4911 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970338 4911 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970342 4911 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970345 4911 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970348 4911 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970352 4911 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970355 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970360 4911 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970364 4911 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970367 4911 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970371 4911 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970374 4911 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970378 4911 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970382 4911 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970385 4911 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970389 4911 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970392 4911 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970396 4911 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970400 4911 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970403 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970407 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970410 4911 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970414 4911 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970418 4911 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970421 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970424 4911 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970428 4911 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970431 4911 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970435 4911 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970438 4911 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970443 4911 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970447 4911 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970451 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970455 4911 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970472 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970476 4911 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970480 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970483 4911 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970487 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970491 4911 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.970495 4911 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.970500 4911 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.981423 4911 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.981494 4911 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981655 4911 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981678 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981687 4911 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981697 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981707 4911 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981714 4911 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981722 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981730 4911 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981737 4911 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec
01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981745 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981753 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981761 4911 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981769 4911 feature_gate.go:330] unrecognized feature gate: Example Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981778 4911 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981787 4911 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981799 4911 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981810 4911 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981821 4911 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981830 4911 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981839 4911 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981847 4911 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981855 4911 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981863 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981871 4911 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981881 4911 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981892 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981902 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981910 4911 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981918 4911 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981926 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981935 4911 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981943 4911 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981952 4911 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981960 4911 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981969 4911 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981978 4911 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981986 4911 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.981994 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982002 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982011 4911 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallGCP Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982020 4911 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982028 4911 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982037 4911 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982046 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982054 4911 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982065 4911 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982075 4911 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982084 4911 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982092 4911 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982101 4911 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982109 4911 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982119 4911 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982127 4911 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982134 4911 feature_gate.go:330] 
unrecognized feature gate: MinimumKubeletVersion Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982142 4911 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982150 4911 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982158 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982166 4911 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982173 4911 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982181 4911 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982189 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982196 4911 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982204 4911 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982213 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982221 4911 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982229 4911 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982236 4911 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982244 4911 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982252 
4911 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982259 4911 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982268 4911 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.982281 4911 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982528 4911 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982543 4911 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982552 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982561 4911 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982569 4911 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982577 4911 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982585 4911 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982592 4911 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982600 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982608 4911 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982616 4911 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982624 4911 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982631 4911 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982639 4911 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982647 4911 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982655 4911 feature_gate.go:330] 
unrecognized feature gate: AlibabaPlatform Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982663 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982670 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982678 4911 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982685 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982693 4911 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982701 4911 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982708 4911 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982717 4911 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982724 4911 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982733 4911 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982741 4911 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982749 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982756 4911 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982767 4911 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982777 4911 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982786 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982796 4911 feature_gate.go:330] unrecognized feature gate: Example Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982805 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982814 4911 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982823 4911 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982834 4911 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982843 4911 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982853 4911 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982861 4911 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982869 4911 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982877 4911 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982885 4911 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982893 4911 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982900 4911 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982908 4911 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982916 4911 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982923 4911 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982932 4911 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982939 4911 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982947 4911 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982955 4911 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982962 4911 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982971 4911 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982978 4911 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982986 4911 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.982994 4911 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983003 4911 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983012 4911 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983020 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983028 4911 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983036 4911 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983044 4911 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983052 4911 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983060 4911 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983067 4911 feature_gate.go:330] unrecognized feature gate: 
SignatureStores Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983076 4911 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983086 4911 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983093 4911 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983101 4911 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 00:07:19 crc kubenswrapper[4911]: W1201 00:07:19.983110 4911 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.983122 4911 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.983595 4911 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.987869 4911 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.988023 4911 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.990061 4911 server.go:997] "Starting client certificate rotation" Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.990385 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.990629 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-16 11:10:40.046412253 +0000 UTC Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.990697 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 00:07:19 crc kubenswrapper[4911]: I1201 00:07:19.997744 4911 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 00:07:19 crc kubenswrapper[4911]: E1201 00:07:19.998377 4911 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.000909 4911 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.011560 4911 log.go:25] "Validated CRI v1 runtime API" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.034555 4911 log.go:25] "Validated CRI v1 image API" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.036769 4911 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.039419 4911 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-00-02-15-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.039514 4911 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.069400 4911 manager.go:217] Machine: {Timestamp:2025-12-01 00:07:20.067052673 +0000 UTC m=+0.205749524 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:fe489437-a045-4085-a506-8b5514dd1af7 BootID:b4d95f07-110d-43d3-9dda-782c8849ca6a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 
Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:20:03:ad Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:20:03:ad Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c5:c1:c6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5f:f8:33 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f8:c1:92 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ec:b2:ab Speed:-1 Mtu:1496} {Name:eth10 MacAddress:16:f6:4c:bd:80:0d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9e:68:60:f0:1c:15 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.069929 4911 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.070303 4911 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.070816 4911 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.071062 4911 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.071112 4911 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.071417 4911 topology_manager.go:138] "Creating topology manager with none policy" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.071433 4911 container_manager_linux.go:303] "Creating device plugin manager" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.071837 4911 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.071896 4911 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.072160 4911 state_mem.go:36] "Initialized new in-memory state store" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.072526 4911 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.073561 4911 kubelet.go:418] "Attempting to sync node with API server" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.073597 4911 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.073632 4911 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.073656 4911 kubelet.go:324] "Adding apiserver pod source" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.073673 4911 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.076956 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.078633 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.078757 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.077904 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.079533 4911 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.080320 4911 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.081427 4911 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.082388 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.082574 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.082685 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.082789 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.082901 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.083045 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.083149 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.083276 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.083382 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.083526 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.083679 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.084080 4911 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.084578 4911 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.085329 4911 server.go:1280] "Started kubelet" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.085509 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.085819 4911 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.085821 4911 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.086684 4911 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 01 00:07:20 crc systemd[1]: Started Kubernetes Kubelet. Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.087907 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187ceebacc89753f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 00:07:20.085280063 +0000 UTC m=+0.223976844,LastTimestamp:2025-12-01 00:07:20.085280063 +0000 UTC m=+0.223976844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.089200 4911 server.go:460] "Adding debug handlers to kubelet server" Dec 01 00:07:20 crc 
kubenswrapper[4911]: I1201 00:07:20.090548 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.090611 4911 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.090636 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 20:46:06.693923683 +0000 UTC Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.090687 4911 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 908h38m46.603239338s for next certificate rotation Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.090806 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.090899 4911 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.090913 4911 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.091034 4911 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.092665 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.092759 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 
01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.095626 4911 factory.go:55] Registering systemd factory Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.095709 4911 factory.go:221] Registration of the systemd container factory successfully Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.099103 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="200ms" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.099252 4911 factory.go:153] Registering CRI-O factory Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.099662 4911 factory.go:221] Registration of the crio container factory successfully Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.100973 4911 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.101546 4911 factory.go:103] Registering Raw factory Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.101764 4911 manager.go:1196] Started watching for new ooms in manager Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.102941 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103073 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103151 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103165 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103185 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103198 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103212 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103232 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103322 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103368 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103381 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103539 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103568 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103622 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103743 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103763 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103783 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103828 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103882 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103897 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" 
seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.103980 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104069 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104088 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104110 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104385 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104481 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: 
I1201 00:07:20.104537 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104559 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104573 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104591 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104687 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104703 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104758 4911 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104815 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104830 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104935 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104961 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.104983 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105001 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105017 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105034 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105079 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105103 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105118 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105132 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105150 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105169 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105195 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105297 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105341 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105386 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105400 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105551 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105620 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105645 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105726 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105749 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105768 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105787 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105806 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105810 4911 manager.go:319] Starting recovery of all containers
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105821 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105875 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105889 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105902 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105921 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105942 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105961 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.105975 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106003 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106106 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106125 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106145 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106223 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106240 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106258 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106346 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106368 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106404 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106419 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106530 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106546 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106565 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106579 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106592 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106617 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106740 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106761 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106775 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106857 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106875 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106891 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106906 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106921 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.106957 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107007 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107020 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107033 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107073 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107111 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107133 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107405 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107714 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107747 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107847 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107967 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.107994 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108013 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108074 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108096 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108197 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108219 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108283 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108303 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108544 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108633 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108671 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108721 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108746 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108787 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108837 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108927 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.108986 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109044 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109080 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109123 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109146 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109188 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109240 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109272 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109299 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109387 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109430 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109449 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109515 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109551 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109570 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109605 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109623 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109682 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109739 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109758 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109794 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109858 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109872 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109889 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109907 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109960 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.109999 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.110015 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.110085 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.110120 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111222 4911 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111257 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111282 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111298 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111338 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111357 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111373 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111388 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111406 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111456 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111497 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111573 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111694 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111717 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111732 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111750 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111786 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111801 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111820 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111835 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111912 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111947 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111964 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111983 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.111998 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112017 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112067 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112135 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112170 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112185 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112205 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112222 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112237 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112257 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112271 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b"
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112287 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112327 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112341 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112382 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112397 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112412 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112429 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112480 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112530 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112563 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112610 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112629 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112701 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112756 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112773 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112789 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112808 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112844 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112885 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112900 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112952 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112972 4911 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112986 4911 reconstruct.go:97] "Volume reconstruction finished" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.112997 4911 reconciler.go:26] "Reconciler: start to sync state" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.136505 4911 manager.go:324] Recovery completed Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.146583 4911 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.150228 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.150272 4911 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.150378 4911 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.150420 4911 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.150526 4911 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.152249 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.152352 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.155388 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.155499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.155529 4911 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.156972 4911 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.157080 4911 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.157178 4911 state_mem.go:36] "Initialized new in-memory state store" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.172166 4911 policy_none.go:49] "None policy: Start" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.173392 4911 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.173541 4911 state_mem.go:35] "Initializing new in-memory state store" Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.191622 4911 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.232101 4911 manager.go:334] "Starting Device Plugin manager" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.232200 4911 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.232221 4911 server.go:79] "Starting device plugin registration server" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.232916 4911 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.232933 4911 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.233302 4911 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.233421 4911 plugin_manager.go:116] "The desired_state_of_world populator 
(plugin watcher) starts" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.233429 4911 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.240922 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.250665 4911 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.250889 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.260885 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.261002 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.261110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.261419 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.261555 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.261614 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.262828 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.262870 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.262880 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.262899 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.262929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.262946 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.263142 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.263208 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.263264 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.264049 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.264076 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.264086 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.264421 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.264902 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265082 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265394 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265406 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265518 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265629 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265569 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265680 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.265713 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.269891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.269975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.269990 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.270698 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.271647 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.271804 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.271839 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.271850 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.272138 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.272255 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.272336 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.273454 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.273598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.273624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.300275 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="400ms" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316374 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316474 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316539 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316565 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316591 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316610 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316631 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316649 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316717 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316756 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316789 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316819 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.316947 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.317020 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.317042 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.333390 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.335055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.335096 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.335109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.335135 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.335379 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418142 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418216 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418236 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418257 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418277 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418298 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418317 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418335 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418353 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418370 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418386 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418404 
4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418423 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418422 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418452 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418504 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418493 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418522 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418545 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418444 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418416 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418538 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418509 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418587 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418634 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418661 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418671 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418680 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418704 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.418847 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.536581 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.538316 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.538379 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.538398 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.538440 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.539094 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.592874 4911 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.599240 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.621190 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.631786 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0954a52bdf53adc4199b6563ce2936e836a34c89d52d9660a37d501b3d128b8f WatchSource:0}: Error finding container 0954a52bdf53adc4199b6563ce2936e836a34c89d52d9660a37d501b3d128b8f: Status 404 returned error can't find the container with id 0954a52bdf53adc4199b6563ce2936e836a34c89d52d9660a37d501b3d128b8f Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.636281 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-432cf23a610bcd1f3f77245215352ca6a5b5a838860c45a717adaf517b86607a WatchSource:0}: Error finding container 432cf23a610bcd1f3f77245215352ca6a5b5a838860c45a717adaf517b86607a: Status 404 returned error can't find the container with id 432cf23a610bcd1f3f77245215352ca6a5b5a838860c45a717adaf517b86607a Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.640837 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.642949 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.689969 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-871031886a780030d1f8698577b3e20a04d96c0c67c3f18a93cd1e9a64aa8422 WatchSource:0}: Error finding container 871031886a780030d1f8698577b3e20a04d96c0c67c3f18a93cd1e9a64aa8422: Status 404 returned error can't find the container with id 871031886a780030d1f8698577b3e20a04d96c0c67c3f18a93cd1e9a64aa8422 Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.701042 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="800ms" Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.708645 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-98a26b850208cb09f6632a1eee39ba740b27fa9277db4b445a9491cd72902280 WatchSource:0}: Error finding container 98a26b850208cb09f6632a1eee39ba740b27fa9277db4b445a9491cd72902280: Status 404 returned error can't find the container with id 98a26b850208cb09f6632a1eee39ba740b27fa9277db4b445a9491cd72902280 Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.712672 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-27cfd2e3ae6a9c2f2d969da0cb078731e8ce3da702e97af996b20975abb8fcf2 WatchSource:0}: Error finding container 27cfd2e3ae6a9c2f2d969da0cb078731e8ce3da702e97af996b20975abb8fcf2: Status 404 returned error can't find the container with id 
27cfd2e3ae6a9c2f2d969da0cb078731e8ce3da702e97af996b20975abb8fcf2 Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.910312 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.910449 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:20 crc kubenswrapper[4911]: W1201 00:07:20.929943 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.930041 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.939405 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.941967 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.942009 4911 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.942024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:20 crc kubenswrapper[4911]: I1201 00:07:20.942053 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 00:07:20 crc kubenswrapper[4911]: E1201 00:07:20.942496 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.087032 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.161992 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"871031886a780030d1f8698577b3e20a04d96c0c67c3f18a93cd1e9a64aa8422"} Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.162974 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0954a52bdf53adc4199b6563ce2936e836a34c89d52d9660a37d501b3d128b8f"} Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.164529 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"432cf23a610bcd1f3f77245215352ca6a5b5a838860c45a717adaf517b86607a"} Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.166959 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"27cfd2e3ae6a9c2f2d969da0cb078731e8ce3da702e97af996b20975abb8fcf2"} Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.168201 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"98a26b850208cb09f6632a1eee39ba740b27fa9277db4b445a9491cd72902280"} Dec 01 00:07:21 crc kubenswrapper[4911]: E1201 00:07:21.503544 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="1.6s" Dec 01 00:07:21 crc kubenswrapper[4911]: W1201 00:07:21.523293 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:21 crc kubenswrapper[4911]: E1201 00:07:21.523408 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:21 crc kubenswrapper[4911]: W1201 00:07:21.642637 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:21 crc kubenswrapper[4911]: E1201 00:07:21.642768 4911 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.743285 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.745656 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.745692 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.745705 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:21 crc kubenswrapper[4911]: I1201 00:07:21.745737 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 00:07:21 crc kubenswrapper[4911]: E1201 00:07:21.746235 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.086890 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.130193 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 00:07:22 crc kubenswrapper[4911]: E1201 00:07:22.131283 4911 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.180737 4911 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205" exitCode=0 Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.180819 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205"} Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.181041 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.183437 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.183523 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.183550 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.183723 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7"} Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.183764 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e"} Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.183779 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920"} Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.185608 4911 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57" exitCode=0 Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.185677 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57"} Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.185789 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.187146 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.187181 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.187197 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.188108 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f" exitCode=0 Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.188189 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f"} Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.188235 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.189131 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.189162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.189174 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.190253 4911 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377" exitCode=0 Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.190292 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377"} Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.190392 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.190580 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 
00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.191534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.191557 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.191592 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.191608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.191575 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:22 crc kubenswrapper[4911]: I1201 00:07:22.191688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.196045 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.196170 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.197924 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.197966 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.197978 4911 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.198861 4911 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334" exitCode=0 Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.198979 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.199021 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.200102 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.200125 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.200135 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.202012 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.202045 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.202058 4911 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.202069 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.208521 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fb1e7d4fb61774dd975be9022a9fc49669ba3d40607f3b5b14981ce21558f790"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.208584 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.209230 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.209260 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.209273 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.210894 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.210943 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.210966 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15"} Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.211094 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.212153 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.212188 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.212205 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.346803 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.353084 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.353117 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.353133 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.353166 4911 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Dec 01 00:07:23 crc kubenswrapper[4911]: I1201 00:07:23.399938 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.217954 4911 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9" exitCode=0 Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.218104 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9"} Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.218142 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.220599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.220665 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.220688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.224984 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0"} Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.225082 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:24 crc kubenswrapper[4911]: 
I1201 00:07:24.225143 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.225232 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.225279 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.225148 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.226710 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.226752 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.226771 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.227201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.227233 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.227251 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.227317 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.227354 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:24 crc 
kubenswrapper[4911]: I1201 00:07:24.227359 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.227446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.227372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.227509 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.508589 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:24 crc kubenswrapper[4911]: I1201 00:07:24.631664 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.234585 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73"} Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.234636 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.234658 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df"} Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.234681 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:25 crc kubenswrapper[4911]: 
I1201 00:07:25.234684 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea"} Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.234855 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.236105 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.236152 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.236178 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.236360 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.236401 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.236419 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:25 crc kubenswrapper[4911]: I1201 00:07:25.402692 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.013774 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.246214 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3"} Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.246299 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345"} Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.246368 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.246401 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.246400 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.248402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.248453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.248403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.248539 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.248564 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.248499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:26 crc 
kubenswrapper[4911]: I1201 00:07:26.248815 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.248853 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.248872 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.330239 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.400945 4911 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 00:07:26 crc kubenswrapper[4911]: I1201 00:07:26.401041 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 00:07:27 crc kubenswrapper[4911]: I1201 00:07:27.249413 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:27 crc kubenswrapper[4911]: I1201 00:07:27.249424 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:27 crc kubenswrapper[4911]: I1201 00:07:27.251371 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:27 crc 
kubenswrapper[4911]: I1201 00:07:27.251420 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:27 crc kubenswrapper[4911]: I1201 00:07:27.251446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:27 crc kubenswrapper[4911]: I1201 00:07:27.251385 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:27 crc kubenswrapper[4911]: I1201 00:07:27.251580 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:27 crc kubenswrapper[4911]: I1201 00:07:27.251607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.032599 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.032848 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.034430 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.034500 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.034513 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.044930 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.251906 
4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.253600 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.253652 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.253665 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.595598 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.595867 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.597356 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.597408 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.597426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:28 crc kubenswrapper[4911]: I1201 00:07:28.789174 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.255353 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.257011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.257089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.257108 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.482947 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.483248 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.484924 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.485005 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.485026 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.519373 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.519745 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.521336 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.521391 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:29 crc kubenswrapper[4911]: I1201 00:07:29.521408 4911 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:30 crc kubenswrapper[4911]: E1201 00:07:30.241102 4911 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 00:07:33 crc kubenswrapper[4911]: I1201 00:07:33.087270 4911 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 00:07:33 crc kubenswrapper[4911]: E1201 00:07:33.105069 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 01 00:07:33 crc kubenswrapper[4911]: E1201 00:07:33.355033 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 01 00:07:33 crc kubenswrapper[4911]: W1201 00:07:33.731204 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 01 00:07:33 crc kubenswrapper[4911]: I1201 00:07:33.731316 4911 trace.go:236] Trace[310390517]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 00:07:23.729) (total time: 10001ms): Dec 01 00:07:33 crc kubenswrapper[4911]: Trace[310390517]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:33.731) Dec 01 00:07:33 crc 
kubenswrapper[4911]: Trace[310390517]: [10.001899661s] [10.001899661s] END Dec 01 00:07:33 crc kubenswrapper[4911]: E1201 00:07:33.731341 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 00:07:33 crc kubenswrapper[4911]: W1201 00:07:33.963632 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 01 00:07:33 crc kubenswrapper[4911]: I1201 00:07:33.963718 4911 trace.go:236] Trace[1848288593]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 00:07:23.962) (total time: 10001ms): Dec 01 00:07:33 crc kubenswrapper[4911]: Trace[1848288593]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:33.963) Dec 01 00:07:33 crc kubenswrapper[4911]: Trace[1848288593]: [10.001461s] [10.001461s] END Dec 01 00:07:33 crc kubenswrapper[4911]: E1201 00:07:33.963740 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 00:07:33 crc kubenswrapper[4911]: W1201 00:07:33.964869 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake 
timeout Dec 01 00:07:33 crc kubenswrapper[4911]: I1201 00:07:33.964932 4911 trace.go:236] Trace[5898325]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 00:07:23.964) (total time: 10000ms): Dec 01 00:07:33 crc kubenswrapper[4911]: Trace[5898325]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (00:07:33.964) Dec 01 00:07:33 crc kubenswrapper[4911]: Trace[5898325]: [10.000856176s] [10.000856176s] END Dec 01 00:07:33 crc kubenswrapper[4911]: E1201 00:07:33.964947 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 00:07:33 crc kubenswrapper[4911]: W1201 00:07:33.970181 4911 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 01 00:07:33 crc kubenswrapper[4911]: I1201 00:07:33.970248 4911 trace.go:236] Trace[1651798773]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 00:07:23.968) (total time: 10001ms): Dec 01 00:07:33 crc kubenswrapper[4911]: Trace[1651798773]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:33.970) Dec 01 00:07:33 crc kubenswrapper[4911]: Trace[1651798773]: [10.001493952s] [10.001493952s] END Dec 01 00:07:33 crc kubenswrapper[4911]: E1201 00:07:33.970264 4911 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 00:07:34 crc kubenswrapper[4911]: E1201 00:07:34.132993 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187ceebacc89753f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 00:07:20.085280063 +0000 UTC m=+0.223976844,LastTimestamp:2025-12-01 00:07:20.085280063 +0000 UTC m=+0.223976844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 00:07:34 crc kubenswrapper[4911]: I1201 00:07:34.450362 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 00:07:34 crc kubenswrapper[4911]: I1201 00:07:34.450432 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 00:07:34 crc kubenswrapper[4911]: I1201 00:07:34.461015 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe 
status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 00:07:34 crc kubenswrapper[4911]: I1201 00:07:34.461088 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 00:07:34 crc kubenswrapper[4911]: I1201 00:07:34.637490 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]log ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]etcd ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/generic-apiserver-start-informers ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/priority-and-fairness-filter ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 01 00:07:34 crc kubenswrapper[4911]: 
[+]poststarthook/start-apiextensions-informers ok Dec 01 00:07:34 crc kubenswrapper[4911]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/crd-informer-synced ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/start-system-namespaces-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 01 00:07:34 crc kubenswrapper[4911]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 01 00:07:34 crc kubenswrapper[4911]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/bootstrap-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/start-kube-aggregator-informers ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/apiservice-registration-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/apiservice-discovery-controller ok Dec 01 00:07:34 crc 
kubenswrapper[4911]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]autoregister-completion ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/apiservice-openapi-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 01 00:07:34 crc kubenswrapper[4911]: livez check failed Dec 01 00:07:34 crc kubenswrapper[4911]: I1201 00:07:34.637555 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:07:36 crc kubenswrapper[4911]: I1201 00:07:36.401419 4911 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 00:07:36 crc kubenswrapper[4911]: I1201 00:07:36.401568 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 00:07:36 crc kubenswrapper[4911]: I1201 00:07:36.555276 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:36 crc kubenswrapper[4911]: I1201 00:07:36.557395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:36 crc kubenswrapper[4911]: I1201 00:07:36.557506 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:36 crc kubenswrapper[4911]: I1201 00:07:36.557527 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:36 crc kubenswrapper[4911]: I1201 00:07:36.557561 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 00:07:36 crc kubenswrapper[4911]: E1201 00:07:36.563377 4911 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 00:07:37 crc kubenswrapper[4911]: I1201 00:07:37.620047 4911 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 00:07:38 crc kubenswrapper[4911]: I1201 00:07:38.796082 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:38 crc kubenswrapper[4911]: I1201 00:07:38.796280 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:38 crc kubenswrapper[4911]: I1201 00:07:38.797846 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:38 crc kubenswrapper[4911]: I1201 00:07:38.797904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:38 crc kubenswrapper[4911]: I1201 00:07:38.797924 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.126750 4911 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.325370 4911 reflector.go:368] Caches populated for 
*v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.452093 4911 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.459252 4911 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.460380 4911 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.507574 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57054->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.507868 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57054->192.168.126.11:17697: read: connection reset by peer" Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.641946 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.642880 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.642967 4911 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.649728 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.844971 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 00:07:39 crc kubenswrapper[4911]: I1201 00:07:39.860602 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.086125 4911 apiserver.go:52] "Watching apiserver" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.090150 4911 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.090731 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.091386 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.091508 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.091599 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.091697 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.091876 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.091916 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.092020 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.092806 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.092926 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.094971 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.095731 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.095764 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.095742 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.095941 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.096045 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.096378 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.096534 4911 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.096575 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.120449 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.136032 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165328 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165395 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165437 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165511 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165575 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165623 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165664 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165773 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.165829 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.166400 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.167092 4911 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.175391 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.176132 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.191939 4911 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.537800 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.537844 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.537871 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.538007 4911 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:41.037973523 +0000 UTC m=+21.176670304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.539358 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.539699 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.539982 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.540115 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.540178 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.540328 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.540390 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.540533 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.540620 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 00:07:40 crc 
kubenswrapper[4911]: I1201 00:07:40.540672 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.540725 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.540763 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541109 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541275 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541393 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541732 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541782 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541805 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541831 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541863 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541887 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541920 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541949 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541970 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.541989 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542020 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542047 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542076 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542100 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542130 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542156 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542184 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542218 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542245 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542319 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542342 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542360 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542379 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542398 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542413 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542432 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542449 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542536 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542566 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542566 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542642 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542858 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542878 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542941 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542961 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542977 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.542998 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543094 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543271 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543439 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.543614 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:07:41.043574428 +0000 UTC m=+21.182271199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543112 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543341 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543703 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543784 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543829 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543370 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543912 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543926 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543967 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.543976 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544097 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544197 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544237 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544274 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544438 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544541 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544572 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544609 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544608 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544663 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544692 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544722 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544742 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544708 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544162 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544816 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544843 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.544887 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545140 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545188 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545216 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545245 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545244 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545276 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545346 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545370 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545505 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545547 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545585 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545692 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545704 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545743 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546068 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546099 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546121 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546519 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546552 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546572 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546591 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546616 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546645 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546662 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 00:07:40 crc 
kubenswrapper[4911]: I1201 00:07:40.546680 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546700 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546720 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546742 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546765 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546786 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546805 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.545817 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546431 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546435 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546605 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546759 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.546876 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.547300 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.547705 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.547906 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.548159 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.548515 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.548630 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.550157 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.554019 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.554052 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.554031 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.554239 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.554525 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.554545 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.554856 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555121 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555184 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555327 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555395 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555421 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555476 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555502 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555526 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555552 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555577 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555599 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555611 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555620 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555693 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555746 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555774 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555797 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555819 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" 
(UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555864 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.555891 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556040 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556153 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556178 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556200 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556228 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556250 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556270 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556292 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556335 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556362 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556385 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556433 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 
00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556481 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556508 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556535 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556562 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556587 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556615 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556642 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556671 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556697 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556728 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556755 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 
00:07:40.556782 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556809 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556839 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556869 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556897 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556925 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556967 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556987 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557027 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557052 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557078 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557098 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557119 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557140 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557161 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557183 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557203 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557223 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557243 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557262 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557303 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557326 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557347 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557405 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557424 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557441 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557485 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557507 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557527 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557547 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557565 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557583 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557610 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557637 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557662 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557684 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557708 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557734 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557758 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 
01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557802 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557828 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557854 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557881 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557908 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557938 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557967 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557995 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558025 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558059 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558087 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 00:07:40 crc 
kubenswrapper[4911]: I1201 00:07:40.558114 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558169 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558213 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558243 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558267 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558293 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558320 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558347 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558366 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558386 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558405 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558424 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558443 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558489 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558516 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558594 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558627 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558669 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558709 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558747 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558769 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558870 4911 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558884 4911 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558895 4911 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558906 4911 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558917 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558927 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558937 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558948 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 
00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558958 4911 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.558968 4911 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559002 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559012 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559023 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559035 4911 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559045 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559055 4911 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559067 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559078 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559088 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559098 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559108 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559118 4911 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559130 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" 
(UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559141 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559151 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559162 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559174 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559185 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559195 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559204 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559214 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559225 4911 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559236 4911 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559246 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559256 4911 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559266 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559295 4911 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559305 
4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559315 4911 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559325 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559335 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559345 4911 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559360 4911 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559371 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559381 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559390 4911 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.556492 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557488 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557622 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557744 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557769 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.557991 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.559581 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566478 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.563286 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.563412 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.563661 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.563847 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.564536 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.564537 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.564557 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.564823 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.564933 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.564950 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566615 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.564997 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.565258 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.565389 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.565422 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.565711 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.565760 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.565823 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.565876 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.565922 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566013 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566150 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566219 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566270 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566379 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566871 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566890 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566553 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566597 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.566801 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.567044 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.567304 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.567375 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.567524 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.567603 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.567778 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.567877 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:41.067847942 +0000 UTC m=+21.206544753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.567912 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.568017 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.568313 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.569098 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.569376 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.569873 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.569949 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:41.069930192 +0000 UTC m=+21.208626993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.575742 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.576231 4911 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.576755 4911 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.577113 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.577301 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.577525 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.577670 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.577680 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.577753 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.577859 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.578070 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.578327 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.578501 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.579492 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.579791 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.579840 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.580370 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.580441 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.580444 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.580503 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.580535 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: E1201 00:07:40.580590 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:41.080561967 +0000 UTC m=+21.219258738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.580751 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.580986 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.581200 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.581294 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.581946 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.581971 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.582400 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.582789 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.583095 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.583266 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.583561 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.583732 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.583878 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.583888 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.584285 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.584527 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.586108 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.586482 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.586595 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.586767 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.586846 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.586931 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.587085 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.587137 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.587184 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.587276 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.587195 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.587789 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.588440 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.589232 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.589503 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.589571 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.591965 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.589890 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.596943 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.596935 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.589608 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: 
"utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.597110 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.589624 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.589858 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.590071 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.590086 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.590371 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.590441 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.590505 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.588604 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.590755 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.590897 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.591338 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.591242 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.597356 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.597617 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.598040 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.598528 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.599022 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.598984 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.599118 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.599314 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.599576 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.599875 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.599992 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.600040 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.599994 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.600868 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.601220 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.601354 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.601761 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.602687 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.602873 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.608305 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.608575 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.611714 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.613239 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.613607 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.614441 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.614540 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.614946 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.614774 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.615303 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.615549 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.615588 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.615886 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.616001 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.616297 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.616296 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.616620 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.616775 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.617069 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.617106 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.617598 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.620185 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.630150 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.631422 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.643617 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.656806 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660444 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660535 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660550 4911 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660563 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660576 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660588 4911 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660601 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660614 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660606 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660626 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 
00:07:40.660719 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660743 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660764 4911 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660786 4911 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660806 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660825 4911 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660846 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660865 4911 reconciler_common.go:293] "Volume detached for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660887 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660906 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660926 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660946 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660968 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.660989 4911 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661011 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661030 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661049 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661068 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661090 4911 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661109 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661128 4911 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661148 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") 
on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661167 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661186 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661204 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661223 4911 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661242 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661261 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661281 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 
00:07:40.661300 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661319 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661338 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661357 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661375 4911 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661394 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661412 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661430 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661451 4911 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661497 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661516 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661534 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661553 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661571 4911 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661591 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on 
node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661609 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661628 4911 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661647 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661667 4911 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661685 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661704 4911 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661725 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 
00:07:40.661743 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661761 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661780 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661798 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661816 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661837 4911 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661856 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661874 4911 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661892 4911 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661911 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661929 4911 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661948 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661967 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.661987 4911 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662008 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node 
\"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662074 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662094 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662115 4911 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662136 4911 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662174 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662195 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662215 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" 
DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662234 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662255 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662274 4911 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662295 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662314 4911 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662333 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662352 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662372 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" 
(UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662391 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662410 4911 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662427 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662445 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662488 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662509 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662528 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662547 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662567 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662587 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662606 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662626 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662645 4911 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662663 4911 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662681 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662699 4911 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662718 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662738 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662755 4911 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662775 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662794 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" 
Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662812 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662830 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662849 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662867 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662886 4911 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662904 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662923 4911 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662942 4911 reconciler_common.go:293] "Volume detached 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662961 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662979 4911 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.662996 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663015 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663033 4911 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663051 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663069 4911 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663088 4911 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663106 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663124 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663143 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663161 4911 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663180 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663198 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc 
kubenswrapper[4911]: I1201 00:07:40.663216 4911 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663236 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663254 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663272 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663290 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663308 4911 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663326 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663344 4911 reconciler_common.go:293] 
"Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663361 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663379 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663397 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663416 4911 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663434 4911 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.663451 4911 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.680874 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.692223 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.702886 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-oper
ator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.714450 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.717168 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.733423 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.736194 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:07:40 crc kubenswrapper[4911]: W1201 00:07:40.738683 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-d749d6b9d7acee3ff73184d61ec9c41b4a99fec563c71e05b29d10e351164313 WatchSource:0}: Error finding container d749d6b9d7acee3ff73184d61ec9c41b4a99fec563c71e05b29d10e351164313: Status 404 returned error can't find the container with id d749d6b9d7acee3ff73184d61ec9c41b4a99fec563c71e05b29d10e351164313 Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.738712 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.738821 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 00:07:40 crc kubenswrapper[4911]: W1201 00:07:40.752323 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-956ae91904cdffb5c59bfbc397d724cf4764966d2574665efa54b04effeb1fb1 WatchSource:0}: Error finding container 956ae91904cdffb5c59bfbc397d724cf4764966d2574665efa54b04effeb1fb1: Status 404 returned error can't find the container with id 956ae91904cdffb5c59bfbc397d724cf4764966d2574665efa54b04effeb1fb1 Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.752361 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: W1201 00:07:40.758619 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-40bbb51102f548c354b1cbeeebf72659a0de671793ed2d13a79cb7bc5f5c081f WatchSource:0}: Error finding container 40bbb51102f548c354b1cbeeebf72659a0de671793ed2d13a79cb7bc5f5c081f: Status 404 returned error can't find the container with id 40bbb51102f548c354b1cbeeebf72659a0de671793ed2d13a79cb7bc5f5c081f Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.764739 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.764780 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.778677 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f5
8408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.846953 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.864644 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.878883 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.897138 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.918494 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.931469 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:40 crc kubenswrapper[4911]: I1201 00:07:40.944133 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.067123 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.067398 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:07:42.067346249 +0000 UTC m=+22.206043040 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.067831 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.067860 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.067972 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.068033 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:42.068017275 +0000 UTC m=+22.206714046 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.068134 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.068179 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.068200 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.068272 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:42.06825856 +0000 UTC m=+22.206955341 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.151476 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.151603 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.168216 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.168270 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 
00:07:41.168373 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.168387 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.168397 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.168409 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.168444 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:42.168430719 +0000 UTC m=+22.307127490 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:41 crc kubenswrapper[4911]: E1201 00:07:41.168647 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:42.168603283 +0000 UTC m=+22.307300054 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.174660 4911 csr.go:261] certificate signing request csr-5rvfx is approved, waiting to be issued Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.197733 4911 csr.go:257] certificate signing request csr-5rvfx is issued Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.220787 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8ml8w"] Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.221178 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.224680 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pt7lz"] Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.224948 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pt7lz" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.226251 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.226548 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.226743 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.226910 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.227132 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.227358 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.230879 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.242184 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.255297 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.269226 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68489275-7ca7-441e-9591-bf6993da0b1a-host\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.269262 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-j8x9h\" (UniqueName: \"kubernetes.io/projected/68489275-7ca7-441e-9591-bf6993da0b1a-kube-api-access-j8x9h\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.269307 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/68489275-7ca7-441e-9591-bf6993da0b1a-serviceca\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.269323 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8qqt\" (UniqueName: \"kubernetes.io/projected/9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40-kube-api-access-s8qqt\") pod \"node-resolver-pt7lz\" (UID: \"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\") " pod="openshift-dns/node-resolver-pt7lz" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.269345 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40-hosts-file\") pod \"node-resolver-pt7lz\" (UID: \"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\") " pod="openshift-dns/node-resolver-pt7lz" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.274021 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.283296 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.327776 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.337666 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.347860 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.359306 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.367829 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.369986 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/68489275-7ca7-441e-9591-bf6993da0b1a-serviceca\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.370011 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qqt\" (UniqueName: \"kubernetes.io/projected/9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40-kube-api-access-s8qqt\") pod \"node-resolver-pt7lz\" (UID: \"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\") " pod="openshift-dns/node-resolver-pt7lz" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.370039 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40-hosts-file\") pod \"node-resolver-pt7lz\" (UID: \"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\") " pod="openshift-dns/node-resolver-pt7lz" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.370079 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8x9h\" (UniqueName: \"kubernetes.io/projected/68489275-7ca7-441e-9591-bf6993da0b1a-kube-api-access-j8x9h\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.370119 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68489275-7ca7-441e-9591-bf6993da0b1a-host\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.370184 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68489275-7ca7-441e-9591-bf6993da0b1a-host\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.370246 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40-hosts-file\") pod \"node-resolver-pt7lz\" (UID: \"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\") " pod="openshift-dns/node-resolver-pt7lz" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.371005 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/68489275-7ca7-441e-9591-bf6993da0b1a-serviceca\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" 
Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.377404 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.385256 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.395292 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.406926 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.417697 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.428248 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.445911 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.457799 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qqt\" (UniqueName: \"kubernetes.io/projected/9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40-kube-api-access-s8qqt\") pod \"node-resolver-pt7lz\" (UID: \"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\") " pod="openshift-dns/node-resolver-pt7lz" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.458229 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8x9h\" (UniqueName: \"kubernetes.io/projected/68489275-7ca7-441e-9591-bf6993da0b1a-kube-api-access-j8x9h\") pod \"node-ca-8ml8w\" (UID: \"68489275-7ca7-441e-9591-bf6993da0b1a\") " pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.460645 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.475378 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.484999 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.534516 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8ml8w" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.540334 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pt7lz" Dec 01 00:07:41 crc kubenswrapper[4911]: W1201 00:07:41.545815 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68489275_7ca7_441e_9591_bf6993da0b1a.slice/crio-32617cb81baab6d520006ddf5201ea85b87246191cfeb53b2902b21704808d5e WatchSource:0}: Error finding container 32617cb81baab6d520006ddf5201ea85b87246191cfeb53b2902b21704808d5e: Status 404 returned error can't find the container with id 32617cb81baab6d520006ddf5201ea85b87246191cfeb53b2902b21704808d5e Dec 01 00:07:41 crc kubenswrapper[4911]: W1201 00:07:41.553951 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c8dbb7c_c86c_4fd7_8dbe_5ef321480b40.slice/crio-1e3b6955fdf57aac1ee326efd49b8329aafab4e162a606d1857ad74d9b732f68 WatchSource:0}: Error finding container 1e3b6955fdf57aac1ee326efd49b8329aafab4e162a606d1857ad74d9b732f68: Status 404 returned error can't find the container with id 1e3b6955fdf57aac1ee326efd49b8329aafab4e162a606d1857ad74d9b732f68 Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.558175 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.558213 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d749d6b9d7acee3ff73184d61ec9c41b4a99fec563c71e05b29d10e351164313"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.558770 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8ml8w" 
event={"ID":"68489275-7ca7-441e-9591-bf6993da0b1a","Type":"ContainerStarted","Data":"32617cb81baab6d520006ddf5201ea85b87246191cfeb53b2902b21704808d5e"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.559298 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"956ae91904cdffb5c59bfbc397d724cf4764966d2574665efa54b04effeb1fb1"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.560401 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.570218 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0" exitCode=255 Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.570283 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.570795 4911 scope.go:117] "RemoveContainer" containerID="2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.576405 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pt7lz" event={"ID":"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40","Type":"ContainerStarted","Data":"1e3b6955fdf57aac1ee326efd49b8329aafab4e162a606d1857ad74d9b732f68"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.591401 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.591439 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.591451 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"40bbb51102f548c354b1cbeeebf72659a0de671793ed2d13a79cb7bc5f5c081f"} Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.610738 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.631155 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-h54fr"] Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.631598 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.632345 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.632737 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-cp4w9"] Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.632900 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-hmfxk"] Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.633329 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.633566 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.634507 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.636328 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.636664 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.636793 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.636929 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.637091 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.637255 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.637380 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.637560 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.638606 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.638659 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.639189 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.650584 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.673804 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675022 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-cni-multus\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675057 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-system-cni-dir\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675079 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-k8s-cni-cncf-io\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675095 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-multus-certs\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675121 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-system-cni-dir\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675136 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5758q\" (UniqueName: \"kubernetes.io/projected/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-kube-api-access-5758q\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675153 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-socket-dir-parent\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675168 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-etc-kubernetes\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675182 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-hostroot\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675206 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-cnibin\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675222 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-kubelet\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675236 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-cni-bin\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675252 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/470f170b-eeab-4f43-bd48-18e50771289a-proxy-tls\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675279 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-netns\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675293 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7e63b3d-a855-4971-8a5a-995fad727bb1-cni-binary-copy\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675307 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-os-release\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675320 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-conf-dir\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675334 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-cnibin\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675358 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b7e63b3d-a855-4971-8a5a-995fad727bb1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675391 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4bh\" (UniqueName: \"kubernetes.io/projected/470f170b-eeab-4f43-bd48-18e50771289a-kube-api-access-jx4bh\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675413 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/470f170b-eeab-4f43-bd48-18e50771289a-mcd-auth-proxy-config\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675436 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn52n\" (UniqueName: \"kubernetes.io/projected/b7e63b3d-a855-4971-8a5a-995fad727bb1-kube-api-access-qn52n\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675481 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-cni-binary-copy\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " 
pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675497 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/470f170b-eeab-4f43-bd48-18e50771289a-rootfs\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675511 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-os-release\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675525 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675550 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-cni-dir\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.675564 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-daemon-config\") pod \"multus-h54fr\" (UID: 
\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.686912 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha
256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.699319 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.708693 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.730225 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.753265 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.769225 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776838 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-os-release\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " 
pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776871 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776895 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-cni-dir\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776911 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-daemon-config\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776932 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-cni-multus\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776945 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-system-cni-dir\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " 
pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776958 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-k8s-cni-cncf-io\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776973 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-multus-certs\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.776987 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-system-cni-dir\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777002 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5758q\" (UniqueName: \"kubernetes.io/projected/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-kube-api-access-5758q\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777016 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-socket-dir-parent\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 
00:07:41.777033 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-etc-kubernetes\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777046 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-hostroot\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777066 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-cnibin\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777080 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-kubelet\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777094 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-cni-bin\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777112 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/470f170b-eeab-4f43-bd48-18e50771289a-proxy-tls\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777127 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-netns\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777141 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7e63b3d-a855-4971-8a5a-995fad727bb1-cni-binary-copy\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777183 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-os-release\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777198 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-conf-dir\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777214 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-cnibin\") pod 
\"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777229 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b7e63b3d-a855-4971-8a5a-995fad727bb1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777253 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4bh\" (UniqueName: \"kubernetes.io/projected/470f170b-eeab-4f43-bd48-18e50771289a-kube-api-access-jx4bh\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777269 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/470f170b-eeab-4f43-bd48-18e50771289a-mcd-auth-proxy-config\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777283 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn52n\" (UniqueName: \"kubernetes.io/projected/b7e63b3d-a855-4971-8a5a-995fad727bb1-kube-api-access-qn52n\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777310 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-cni-binary-copy\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777324 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/470f170b-eeab-4f43-bd48-18e50771289a-rootfs\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777377 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/470f170b-eeab-4f43-bd48-18e50771289a-rootfs\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777427 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-socket-dir-parent\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777451 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-etc-kubernetes\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777493 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-hostroot\") pod \"multus-h54fr\" (UID: 
\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777521 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-cnibin\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777542 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-kubelet\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777562 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-cni-bin\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777661 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777681 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-cni-dir\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777730 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-cnibin\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777760 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-netns\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778018 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-k8s-cni-cncf-io\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778049 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-var-lib-cni-multus\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778071 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-system-cni-dir\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778093 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-host-run-multus-certs\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.777179 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7e63b3d-a855-4971-8a5a-995fad727bb1-os-release\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778174 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-os-release\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778271 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-system-cni-dir\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778316 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7e63b3d-a855-4971-8a5a-995fad727bb1-cni-binary-copy\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778359 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-conf-dir\") pod \"multus-h54fr\" (UID: 
\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.778401 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-multus-daemon-config\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.779093 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-cni-binary-copy\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.779586 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b7e63b3d-a855-4971-8a5a-995fad727bb1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.781692 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/470f170b-eeab-4f43-bd48-18e50771289a-mcd-auth-proxy-config\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.783025 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/470f170b-eeab-4f43-bd48-18e50771289a-proxy-tls\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.785475 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.793670 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4bh\" (UniqueName: \"kubernetes.io/projected/470f170b-eeab-4f43-bd48-18e50771289a-kube-api-access-jx4bh\") pod \"machine-config-daemon-cp4w9\" (UID: \"470f170b-eeab-4f43-bd48-18e50771289a\") " pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.793918 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5758q\" (UniqueName: \"kubernetes.io/projected/0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f-kube-api-access-5758q\") pod \"multus-h54fr\" (UID: \"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\") " pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.794316 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qn52n\" (UniqueName: \"kubernetes.io/projected/b7e63b3d-a855-4971-8a5a-995fad727bb1-kube-api-access-qn52n\") pod \"multus-additional-cni-plugins-hmfxk\" (UID: \"b7e63b3d-a855-4971-8a5a-995fad727bb1\") " pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.799145 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.819623 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.831375 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.842383 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.853158 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.861031 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.878799 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.899644 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.914156 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.927127 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.948399 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.948733 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h54fr" Dec 01 00:07:41 crc kubenswrapper[4911]: W1201 00:07:41.959910 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fefe746_bc51_4bb4_a9b9_cc3dd29c2c0f.slice/crio-0ac57f372c54f8c126f4b073c0398536fbdac5157e00e517b08f7f0afc639a72 WatchSource:0}: Error finding container 0ac57f372c54f8c126f4b073c0398536fbdac5157e00e517b08f7f0afc639a72: Status 404 returned error can't find the container with id 0ac57f372c54f8c126f4b073c0398536fbdac5157e00e517b08f7f0afc639a72 Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.964905 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.978120 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:41Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.985609 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" Dec 01 00:07:41 crc kubenswrapper[4911]: I1201 00:07:41.987405 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:07:41 crc kubenswrapper[4911]: W1201 00:07:41.997070 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e63b3d_a855_4971_8a5a_995fad727bb1.slice/crio-cef2fdb7405696dec22e8a21a3dfc0dda1445655edb43869339fd7e2b440597c WatchSource:0}: Error finding container cef2fdb7405696dec22e8a21a3dfc0dda1445655edb43869339fd7e2b440597c: Status 404 returned error can't find the container with id cef2fdb7405696dec22e8a21a3dfc0dda1445655edb43869339fd7e2b440597c Dec 01 00:07:42 crc kubenswrapper[4911]: W1201 00:07:42.001332 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod470f170b_eeab_4f43_bd48_18e50771289a.slice/crio-71ea8cd4e873568044a747f6124d417a81b75553c18ec97f758f94dd60819d33 WatchSource:0}: Error finding container 71ea8cd4e873568044a747f6124d417a81b75553c18ec97f758f94dd60819d33: Status 404 returned error can't find the container with id 71ea8cd4e873568044a747f6124d417a81b75553c18ec97f758f94dd60819d33 Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.048418 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ptrhz"] Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.049231 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.051344 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.051710 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.051913 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.053373 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.054165 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.058053 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.058210 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.066209 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.079443 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.079567 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.079595 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.079626 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:07:44.079596703 +0000 UTC m=+24.218293524 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.079718 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.079779 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.079796 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 00:07:44.079776717 +0000 UTC m=+24.218473568 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.079803 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.079816 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.079864 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:44.079848959 +0000 UTC m=+24.218545730 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.081600 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.104231 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.124856 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.145789 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.151228 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.151392 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.151247 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.151523 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.155049 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.156039 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.157222 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.157854 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.158983 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.158994 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.159563 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.160112 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.161324 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.162758 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.163769 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 
00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.164296 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.165433 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.166047 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.166675 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.167569 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.168065 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.169247 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.169678 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 
00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.170412 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.171630 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.172098 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.173040 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.173491 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.174609 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.175041 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.175640 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 
00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.176684 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.177141 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.178139 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.178636 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.179599 4911 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.179702 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180586 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-systemd\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 
00:07:42.180627 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-slash\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180652 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180675 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trgxc\" (UniqueName: \"kubernetes.io/projected/d8af6f05-3ccd-4b80-b144-530b83bfdc62-kube-api-access-trgxc\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180695 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-ovn\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180725 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180746 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-systemd-units\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180764 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-node-log\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180785 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-kubelet\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180822 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180843 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-config\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" 
Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180864 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-script-lib\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180887 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180909 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-env-overrides\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180926 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-log-socket\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180948 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-bin\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180967 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-netns\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.180985 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-etc-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.181005 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-var-lib-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.181023 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovn-node-metrics-cert\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.181041 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-netd\") pod \"ovnkube-node-ptrhz\" (UID: 
\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.181061 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.181216 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.181266 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:44.181249307 +0000 UTC m=+24.319946088 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.181279 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.181400 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.181776 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.181790 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:42 crc kubenswrapper[4911]: E1201 00:07:42.181824 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:44.18181417 +0000 UTC m=+24.320510941 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.182231 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.182822 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.184350 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.185052 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.186007 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.186647 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.187629 4911 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.188102 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.189082 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.189618 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.190202 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.190806 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.191662 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.193288 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.194322 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.195048 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.195929 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.196422 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.196991 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.197912 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.198473 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.199309 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.199609 4911 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-01 00:02:41 +0000 UTC, rotation deadline is 2026-09-22 17:30:12.158022716 +0000 UTC Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.199660 4911 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Waiting 7097h22m29.958365137s for next certificate rotation Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.209140 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.225567 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.252004 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.280349 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281713 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-kubelet\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281761 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281786 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-script-lib\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281810 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-config\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281828 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-env-overrides\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281838 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-kubelet\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281850 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-netns\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281889 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-netns\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281914 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-log-socket\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281932 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281933 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-bin\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281952 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-bin\") pod \"ovnkube-node-ptrhz\" (UID: 
\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281962 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-etc-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281975 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-log-socket\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281982 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-var-lib-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.281998 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovn-node-metrics-cert\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282014 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-netd\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282032 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282058 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-systemd\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282080 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-slash\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282099 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282113 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trgxc\" (UniqueName: \"kubernetes.io/projected/d8af6f05-3ccd-4b80-b144-530b83bfdc62-kube-api-access-trgxc\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282135 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-ovn\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282151 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-systemd-units\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282165 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-node-log\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282216 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-node-log\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282617 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-config\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282701 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-netd\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282721 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-env-overrides\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282736 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-etc-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282764 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-var-lib-openvswitch\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282771 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-slash\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282800 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-ovn\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282823 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-systemd-units\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.282767 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.283200 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.283827 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-script-lib\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.284033 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-systemd\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.288514 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovn-node-metrics-cert\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.306889 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trgxc\" (UniqueName: \"kubernetes.io/projected/d8af6f05-3ccd-4b80-b144-530b83bfdc62-kube-api-access-trgxc\") pod \"ovnkube-node-ptrhz\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.316240 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.345735 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.360329 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:42 crc kubenswrapper[4911]: W1201 00:07:42.373654 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8af6f05_3ccd_4b80_b144_530b83bfdc62.slice/crio-0cc0cae2b87e99de99af6e4b7b6f16b5a7cdf4913ebace932223268d99736127 WatchSource:0}: Error finding container 0cc0cae2b87e99de99af6e4b7b6f16b5a7cdf4913ebace932223268d99736127: Status 404 returned error can't find the container with id 0cc0cae2b87e99de99af6e4b7b6f16b5a7cdf4913ebace932223268d99736127 Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.408396 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.596380 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.596436 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" 
event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.596451 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"71ea8cd4e873568044a747f6124d417a81b75553c18ec97f758f94dd60819d33"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.598968 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.600898 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.601195 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.602623 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pt7lz" event={"ID":"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40","Type":"ContainerStarted","Data":"f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.603825 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h54fr" event={"ID":"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f","Type":"ContainerStarted","Data":"500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.603946 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h54fr" 
event={"ID":"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f","Type":"ContainerStarted","Data":"0ac57f372c54f8c126f4b073c0398536fbdac5157e00e517b08f7f0afc639a72"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.605602 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155" exitCode=0 Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.605664 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.605877 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"0cc0cae2b87e99de99af6e4b7b6f16b5a7cdf4913ebace932223268d99736127"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.607217 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8ml8w" event={"ID":"68489275-7ca7-441e-9591-bf6993da0b1a","Type":"ContainerStarted","Data":"1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.609109 4911 generic.go:334] "Generic (PLEG): container finished" podID="b7e63b3d-a855-4971-8a5a-995fad727bb1" containerID="50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68" exitCode=0 Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.609137 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" event={"ID":"b7e63b3d-a855-4971-8a5a-995fad727bb1","Type":"ContainerDied","Data":"50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 
00:07:42.609150 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" event={"ID":"b7e63b3d-a855-4971-8a5a-995fad727bb1","Type":"ContainerStarted","Data":"cef2fdb7405696dec22e8a21a3dfc0dda1445655edb43869339fd7e2b440597c"} Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.619898 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.636220 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.654186 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.668947 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.681712 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.703591 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.725161 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.743391 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.780360 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"dat
a-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.806331 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.819960 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.832763 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.869574 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.885433 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.895939 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.907924 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.930616 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00
:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.963604 4911 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.966102 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.966157 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.966174 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.966375 4911 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.980727 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:42Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.983268 4911 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.983648 4911 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.984908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.984949 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.984959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.984977 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:42 crc kubenswrapper[4911]: I1201 00:07:42.984987 4911 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:42Z","lastTransitionTime":"2025-12-01T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: E1201 00:07:43.027297 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.033979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.034016 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.034028 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.034044 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.034054 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.039716 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: E1201 00:07:43.046005 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.049416 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.049489 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.049499 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.049521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.049534 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: E1201 00:07:43.065652 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.074904 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.074932 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.075005 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.075017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.075037 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.075048 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: E1201 00:07:43.086397 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.089854 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.089890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.089899 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.089926 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.089937 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.114301 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: E1201 00:07:43.129203 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: E1201 00:07:43.129365 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.131004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.131046 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.131057 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.131076 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.131088 4911 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.151136 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:43 crc kubenswrapper[4911]: E1201 00:07:43.151275 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.154647 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.190839 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.249240 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.249286 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.249299 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc 
kubenswrapper[4911]: I1201 00:07:43.249319 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.249331 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.269645 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.284721 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 
00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.318593 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.352272 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.352312 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.352323 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.352339 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.352350 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.354870 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.392802 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00
:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.404792 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.409941 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.413236 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.455100 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.455156 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.455167 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.455181 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.455191 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.459478 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.491723 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.529155 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.558610 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.558648 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.558656 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.558673 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.558682 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.569406 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.609607 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.617265 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" 
event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.617314 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.617326 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.617336 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.617346 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.619348 4911 generic.go:334] "Generic (PLEG): container finished" podID="b7e63b3d-a855-4971-8a5a-995fad727bb1" containerID="223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc" exitCode=0 Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.619407 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" 
event={"ID":"b7e63b3d-a855-4971-8a5a-995fad727bb1","Type":"ContainerDied","Data":"223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.620737 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.653647 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.661952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.661977 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.661986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.662001 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.662012 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.693320 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.733270 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 
00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.764772 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.764812 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.764821 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.764836 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.764847 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.768987 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.811358 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.850219 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.867588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.867623 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.867636 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc 
kubenswrapper[4911]: I1201 00:07:43.867655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.867668 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.893481 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.932877 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.970125 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:43 crc 
kubenswrapper[4911]: I1201 00:07:43.970178 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.970189 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.970209 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.970223 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:43Z","lastTransitionTime":"2025-12-01T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:43 crc kubenswrapper[4911]: I1201 00:07:43.977438 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:43Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.011669 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"m
ountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.049894 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"}
,{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.072775 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc 
kubenswrapper[4911]: I1201 00:07:44.072812 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.072822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.072837 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.072848 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.090471 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.105727 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.105846 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.105883 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:07:48.105854958 +0000 UTC m=+28.244551729 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.105931 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.105964 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.105983 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.105996 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.106039 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:48.106024652 +0000 UTC m=+28.244721433 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.106167 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.106268 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:48.106247709 +0000 UTC m=+28.244944560 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.130524 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"nam
e\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.150944 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.151107 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.151173 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.151268 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.168098 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.175842 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.175897 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.175907 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.175930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.175947 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.207214 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.207274 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.207386 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.207427 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.207501 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.207520 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.207442 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:48.207428209 +0000 UTC m=+28.346124980 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:44 crc kubenswrapper[4911]: E1201 00:07:44.207604 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:48.207583384 +0000 UTC m=+28.346280155 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.213636 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.272715 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.278356 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.278416 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.278429 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.278481 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.278498 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.306031 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.339935 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.381218 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.381267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.381279 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 
00:07:44.381298 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.381310 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.409255 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.432858 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.452656 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.483477 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.483520 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.483531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc 
kubenswrapper[4911]: I1201 00:07:44.483548 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.483558 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.495605 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.532875 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.572857 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.586319 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.586378 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.586396 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.586425 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.586439 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.629657 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.632580 4911 generic.go:334] "Generic (PLEG): container finished" podID="b7e63b3d-a855-4971-8a5a-995fad727bb1" containerID="6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28" exitCode=0 Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.632674 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" event={"ID":"b7e63b3d-a855-4971-8a5a-995fad727bb1","Type":"ContainerDied","Data":"6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.653126 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.664687 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.689959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.689999 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.690009 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.690024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.690033 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.702024 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.732609 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.774593 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.792704 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.792747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.792756 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.792771 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.792780 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.812428 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.849740 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.895530 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.895568 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.895581 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.895600 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.895612 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.900687 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.932917 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.973339 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:07:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.999173 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.999234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.999250 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.999275 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:44 crc kubenswrapper[4911]: I1201 00:07:44.999295 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:44Z","lastTransitionTime":"2025-12-01T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.014635 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.057684 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 
00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.092823 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.102643 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.102717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.102742 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.102776 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.102801 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.141378 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 
00:07:45.151395 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:45 crc kubenswrapper[4911]: E1201 00:07:45.151623 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.178793 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42
745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.206531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.206582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.206592 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.206614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.206627 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.309880 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.309939 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.309958 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.309987 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.310006 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.412440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.412534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.412553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.412605 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.412625 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.516572 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.516627 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.516644 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.516671 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.516690 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.619730 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.620090 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.620108 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.620135 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.620155 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.640877 4911 generic.go:334] "Generic (PLEG): container finished" podID="b7e63b3d-a855-4971-8a5a-995fad727bb1" containerID="36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736" exitCode=0 Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.640961 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" event={"ID":"b7e63b3d-a855-4971-8a5a-995fad727bb1","Type":"ContainerDied","Data":"36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.662325 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.681513 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.699151 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.725250 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.725294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.725308 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.725328 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.725342 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.791717 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.804863 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.818059 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.827426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.827512 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.827528 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.827550 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.827566 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.834241 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.848005 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.860334 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.876443 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.888913 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.904273 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.924061 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.930951 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.930984 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.930993 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.931009 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.931020 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:45Z","lastTransitionTime":"2025-12-01T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.948293 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:45 crc kubenswrapper[4911]: I1201 00:07:45.960888 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.033791 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.033825 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.033835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.033863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.033873 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.136772 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.137117 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.137269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.137442 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.137628 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.151532 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.151580 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:46 crc kubenswrapper[4911]: E1201 00:07:46.152014 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:46 crc kubenswrapper[4911]: E1201 00:07:46.152178 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.240979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.241215 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.241289 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.241383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.241479 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.344318 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.344378 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.344402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.344433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.344489 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.447366 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.447717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.447833 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.448124 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.448196 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.551183 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.551238 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.551250 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.551272 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.551285 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.650410 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.653439 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.653503 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.653514 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.653534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.653564 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.654202 4911 generic.go:334] "Generic (PLEG): container finished" podID="b7e63b3d-a855-4971-8a5a-995fad727bb1" containerID="4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298" exitCode=0 Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.654255 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" event={"ID":"b7e63b3d-a855-4971-8a5a-995fad727bb1","Type":"ContainerDied","Data":"4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.670450 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.686431 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.709569 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.725572 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.738377 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.756941 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.756998 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.757010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.757031 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.757045 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.760187 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.776900 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.791764 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.806098 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.825783 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.849552 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.859548 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.859578 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.859590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.859606 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.859618 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.864799 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a
79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.878518 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.895768 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.909305 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.962834 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.962883 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.962896 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.962919 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:46 crc kubenswrapper[4911]: I1201 00:07:46.962934 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:46Z","lastTransitionTime":"2025-12-01T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.068362 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.068436 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.068490 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.068521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.068540 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.151072 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:47 crc kubenswrapper[4911]: E1201 00:07:47.151288 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.173587 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.173643 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.173660 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.173686 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.173706 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.277599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.277644 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.277654 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.277672 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.277684 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.381831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.382298 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.382492 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.382643 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.382834 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.486158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.486243 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.486269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.486298 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.486317 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.589175 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.589220 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.589234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.589253 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.589267 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.663166 4911 generic.go:334] "Generic (PLEG): container finished" podID="b7e63b3d-a855-4971-8a5a-995fad727bb1" containerID="ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092" exitCode=0 Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.663241 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" event={"ID":"b7e63b3d-a855-4971-8a5a-995fad727bb1","Type":"ContainerDied","Data":"ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.684996 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.691654 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.691698 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.691717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.691745 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.691765 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.702242 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.728490 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.760980 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.781151 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.794554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.794617 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.794637 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.794669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.794698 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.798301 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.813575 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.826812 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.842092 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T0
0:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.855015 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.869782 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.885005 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.895081 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.897252 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.897295 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.897313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.897334 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.897350 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:47Z","lastTransitionTime":"2025-12-01T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.914615 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:47 crc kubenswrapper[4911]: I1201 00:07:47.928699 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:47Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.000526 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.000588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.000610 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc 
kubenswrapper[4911]: I1201 00:07:48.000642 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.000666 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.103392 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.103500 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.103524 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.103554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.103574 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.150891 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.150938 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.151091 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.151249 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.160192 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.160380 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.160425 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.160640 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.160709 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 00:07:56.160688292 +0000 UTC m=+36.299385103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.160806 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:07:56.160792125 +0000 UTC m=+36.299488926 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.160901 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.160938 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.160957 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.161000 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:56.160987121 +0000 UTC m=+36.299683932 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.207135 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.207207 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.207224 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.207297 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.207318 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.261839 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.261943 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.262139 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.262165 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.262184 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.262256 4911 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:56.262233443 +0000 UTC m=+36.400930254 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.262320 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:48 crc kubenswrapper[4911]: E1201 00:07:48.262360 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:07:56.262346976 +0000 UTC m=+36.401043777 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.310171 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.310230 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.310249 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.310274 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.310292 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.421063 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.421508 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.421521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.421540 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.421552 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.523858 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.523907 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.523923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.523944 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.523959 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.626294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.626348 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.626362 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.626384 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.626401 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.729736 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.729785 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.729797 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.729817 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.729830 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.833186 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.833237 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.833259 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.833285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.833306 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.942968 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.943010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.943023 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.943041 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:48 crc kubenswrapper[4911]: I1201 00:07:48.943053 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:48Z","lastTransitionTime":"2025-12-01T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.045819 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.045871 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.045889 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.045915 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.045932 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.149073 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.149129 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.149153 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.149181 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.149201 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.151411 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:49 crc kubenswrapper[4911]: E1201 00:07:49.151574 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.252224 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.252277 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.252296 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.252323 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.252340 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.355098 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.355137 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.355146 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.355161 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.355173 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.458522 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.458588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.458608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.458635 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.458652 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.561049 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.561106 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.561116 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.561137 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.561149 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.663887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.663960 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.663985 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.664020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.664046 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.677034 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.677494 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.677573 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.684446 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" event={"ID":"b7e63b3d-a855-4971-8a5a-995fad727bb1","Type":"ContainerStarted","Data":"846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.711482 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.717523 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.722539 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.733731 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.754516 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.767657 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.767739 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.767756 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc 
kubenswrapper[4911]: I1201 00:07:49.767834 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.767903 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.773086 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.786241 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.806130 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T0
0:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.826436 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.846501 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.869110 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.872101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.872162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.872176 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.872189 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.872201 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.879683 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.893194 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.905947 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.919211 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.951166 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T00:07:49Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.975441 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.975508 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.975522 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.975541 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.975555 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:49Z","lastTransitionTime":"2025-12-01T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.991116 4911 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 01 00:07:49 crc kubenswrapper[4911]: I1201 00:07:49.991921 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ovn-kubernetes/pods/ovnkube-node-ptrhz/status\": read tcp 38.102.83.198:58494->38.102.83.198:6443: use of closed network connection" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.030826 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.046027 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
1T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.060013 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.071795 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.078928 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.078975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.078986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.079004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.079026 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.084161 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.108918 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.128680 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:
21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.142690 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.150843 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.150956 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:50 crc kubenswrapper[4911]: E1201 00:07:50.151076 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:50 crc kubenswrapper[4911]: E1201 00:07:50.151411 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.157309 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.173755 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.182031 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.182081 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.182096 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 
00:07:50.182138 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.182172 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.189082 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.205942 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638
d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.218841 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: 
I1201 00:07:50.234107 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.252084 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.270833 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.285531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.285571 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.285582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.285601 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.285615 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.286246 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.304183 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638
d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.319133 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: 
I1201 00:07:50.337513 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.352095 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.365350 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.383445 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.387687 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.387715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.387724 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.387738 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.387747 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.399096 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z 
is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.419911 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.434035 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.457978 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.472848 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:
21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.490697 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.490765 
4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.490783 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.490814 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.490832 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.491765 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.506573 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.593269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.593342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.593362 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.593385 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.593404 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.694286 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.696407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.696432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.696442 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.696472 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.696483 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.800345 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.800390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.800404 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.800424 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.800438 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.903018 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.903055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.903068 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.903085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:50 crc kubenswrapper[4911]: I1201 00:07:50.903096 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:50Z","lastTransitionTime":"2025-12-01T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.009435 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.009625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.009641 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.009658 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.009669 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.111705 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.111736 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.111744 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.111759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.111768 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.151550 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:51 crc kubenswrapper[4911]: E1201 00:07:51.151681 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.214698 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.214750 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.214759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.214771 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.214782 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.317359 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.317425 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.317444 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.317492 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.317512 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.420939 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.420977 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.420991 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.421015 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.421030 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.525957 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.526045 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.526068 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.526099 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.526129 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.630045 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.630119 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.630143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.630173 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.630195 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.698351 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.733929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.733994 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.734011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.734035 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.734052 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.837612 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.837672 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.837698 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.837726 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.837749 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.941877 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.941947 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.941971 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.942002 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:51 crc kubenswrapper[4911]: I1201 00:07:51.942027 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:51Z","lastTransitionTime":"2025-12-01T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.045279 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.045331 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.045345 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.045364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.045378 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.149137 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.149207 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.149226 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.149253 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.149273 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.151520 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.151604 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:52 crc kubenswrapper[4911]: E1201 00:07:52.151705 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:52 crc kubenswrapper[4911]: E1201 00:07:52.151789 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.252558 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.252613 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.252633 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.252659 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.252679 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.356706 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.356795 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.356819 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.356852 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.356880 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.459894 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.459952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.459970 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.459992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.460011 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.562880 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.562957 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.562976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.563002 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.563024 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.666780 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.666842 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.666869 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.666901 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.666922 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.706560 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/0.log" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.712522 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12" exitCode=1 Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.712605 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.714047 4911 scope.go:117] "RemoveContainer" containerID="cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.739753 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.767397 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.769788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.769836 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.769854 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.769878 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.769896 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.782182 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\"
,\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.798642 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.815694 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.841286 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.857912 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.873067 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.873128 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.873151 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.873176 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.873196 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.874992 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z 
is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.889663 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.901101 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.924877 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00
:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.945915 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.962595 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.977264 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.977321 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.977341 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:52 crc 
kubenswrapper[4911]: I1201 00:07:52.977362 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.977380 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:52Z","lastTransitionTime":"2025-12-01T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.978080 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:52 crc kubenswrapper[4911]: I1201 00:07:52.991878 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:52Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.079410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.079455 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.079494 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 
00:07:53.079514 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.079526 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.151679 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:53 crc kubenswrapper[4911]: E1201 00:07:53.151872 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.182086 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.182134 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.182146 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.182165 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.182177 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.284556 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.284595 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.284603 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.284617 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.284629 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.375319 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.375367 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.375377 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.375395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.375406 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: E1201 00:07:53.386156 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.389624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.389655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.389665 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.389681 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.389691 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: E1201 00:07:53.399999 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.403182 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.403216 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.403225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.403239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.403248 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: E1201 00:07:53.413152 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.415976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.416012 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.416024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.416042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.416053 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: E1201 00:07:53.427349 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload identical to the previous attempt elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.430433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.430477 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.430492 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.430508 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.430520 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: E1201 00:07:53.441279 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: E1201 00:07:53.441439 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.442986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.443015 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.443026 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.443064 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.443098 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.545206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.545249 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.545259 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.545275 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.545288 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.648678 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.648719 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.648730 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.648748 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.648766 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.718737 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/0.log" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.721552 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.721683 4911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.738056 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.750407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.750442 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.750470 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.750487 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.750498 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.758156 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.772540 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.796529 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.814144 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.839285 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.853707 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.853764 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.853782 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.853809 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.853827 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.857231 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z 
is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.880067 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.913301 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.934634 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:
21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.954594 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.956971 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.957230 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.957452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.957725 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.957951 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:53Z","lastTransitionTime":"2025-12-01T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.971657 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:53 crc kubenswrapper[4911]: I1201 00:07:53.986172 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.003060 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T0
0:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:54Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.021804 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:54Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.061128 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.061166 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.061182 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 
00:07:54.061205 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.061222 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.151730 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.151799 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:54 crc kubenswrapper[4911]: E1201 00:07:54.151946 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:54 crc kubenswrapper[4911]: E1201 00:07:54.152108 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.163815 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.163885 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.163908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.163936 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.163958 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.267326 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.267384 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.267402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.267429 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.267517 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.362929 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.370545 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.370608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.370626 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.370659 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.370678 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.473437 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.473518 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.473537 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.473566 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.473583 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.576296 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.576630 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.576673 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.576699 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.576718 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.680163 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.680255 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.680279 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.680312 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.680338 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.783032 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.783092 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.783111 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.783136 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.783153 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.886625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.886678 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.886695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.886723 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.886744 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.990249 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.990592 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.990754 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.991038 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:54 crc kubenswrapper[4911]: I1201 00:07:54.991337 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:54Z","lastTransitionTime":"2025-12-01T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.011361 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2"] Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.012158 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.015192 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.015901 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.039190 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.062560 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.095579 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc 
kubenswrapper[4911]: I1201 00:07:55.095657 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.095684 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.095717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.095748 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.096260 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.135972 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.141162 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2659d33d-d79b-4e62-845f-f1538638f390-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.141239 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2659d33d-d79b-4e62-845f-f1538638f390-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.141318 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2659d33d-d79b-4e62-845f-f1538638f390-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 
crc kubenswrapper[4911]: I1201 00:07:55.141376 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7xcw\" (UniqueName: \"kubernetes.io/projected/2659d33d-d79b-4e62-845f-f1538638f390-kube-api-access-f7xcw\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.150996 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:55 crc kubenswrapper[4911]: E1201 00:07:55.151224 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.158865 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.182633 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.199848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.199913 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.199927 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.199951 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.199970 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.206666 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.222774 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.239355 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T0
0:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.242533 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2659d33d-d79b-4e62-845f-f1538638f390-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.242684 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2659d33d-d79b-4e62-845f-f1538638f390-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.242722 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2659d33d-d79b-4e62-845f-f1538638f390-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.242771 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7xcw\" (UniqueName: \"kubernetes.io/projected/2659d33d-d79b-4e62-845f-f1538638f390-kube-api-access-f7xcw\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.243800 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2659d33d-d79b-4e62-845f-f1538638f390-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.244327 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2659d33d-d79b-4e62-845f-f1538638f390-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.254071 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2659d33d-d79b-4e62-845f-f1538638f390-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.261965 4911 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.267378 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7xcw\" (UniqueName: \"kubernetes.io/projected/2659d33d-d79b-4e62-845f-f1538638f390-kube-api-access-f7xcw\") pod \"ovnkube-control-plane-749d76644c-g2rl2\" (UID: \"2659d33d-d79b-4e62-845f-f1538638f390\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.283312 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.300999 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.303802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.304018 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.304170 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.304313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.304443 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.320273 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.335843 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.350224 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabou
ts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.369176 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.381592 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.408937 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.408985 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.408998 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.409021 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.409033 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.512740 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.512795 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.512807 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.512831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.512846 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.616364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.616418 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.616431 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.616479 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.616496 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.720708 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.720819 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.720838 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.720870 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.720889 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.733045 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/1.log" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.734588 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/0.log" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.741334 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9" exitCode=1 Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.741436 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.741513 4911 scope.go:117] "RemoveContainer" containerID="cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.742785 4911 scope.go:117] "RemoveContainer" containerID="bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9" Dec 01 00:07:55 crc kubenswrapper[4911]: E1201 00:07:55.743084 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.744943 4911 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" event={"ID":"2659d33d-d79b-4e62-845f-f1538638f390","Type":"ContainerStarted","Data":"6555bff138f6c38af8a229183d67c6e8d3114db1de3d3b260918d04eaf8c8161"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.769083 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07
:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b
2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\
\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.792320 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e
9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.811964 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.825407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.825506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.825533 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.825570 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.825596 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.837028 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.857879 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 
00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.874853 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.894179 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.908854 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.927946 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:55 crc 
kubenswrapper[4911]: I1201 00:07:55.927995 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.928010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.928035 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.928049 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:55Z","lastTransitionTime":"2025-12-01T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.943327 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 
model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14
c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.959936 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:55 crc kubenswrapper[4911]: I1201 00:07:55.987002 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00
:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:55Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.008215 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.018993 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.027246 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.030193 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.030251 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.030268 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.030294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.030311 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.044242 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.063034 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.082222 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.105276 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.128408 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/c
ni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\"
,\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc 
kubenswrapper[4911]: I1201 00:07:56.133107 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.133171 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.133184 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.133206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.133220 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.144513 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.151608 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.151608 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.151768 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.151810 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.160232 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.175291 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.186781 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.208441 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00
:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.223220 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.235213 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.235247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.235255 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.235271 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.235282 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.242622 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.254299 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.254406 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.254434 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.254572 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.254591 4911 projected.go:288] Couldn't 
get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.254603 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.254648 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:08:12.254594661 +0000 UTC m=+52.393291482 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.254716 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 00:08:12.254695033 +0000 UTC m=+52.393391934 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.254899 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.255019 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:08:12.254995112 +0000 UTC m=+52.393691933 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.270470 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.296056 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.310606 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.325188 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.337808 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.337889 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.337930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.337939 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc 
kubenswrapper[4911]: I1201 00:07:56.337962 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.337974 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.352361 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.356032 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.356134 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.356292 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.356347 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.356435 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.356452 4911 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.356403 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:08:12.356375958 +0000 UTC m=+52.495072769 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.356571 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 00:08:12.356551353 +0000 UTC m=+52.495248124 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.370375 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initC
ontainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.441539 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.441600 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.441625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc 
kubenswrapper[4911]: I1201 00:07:56.441659 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.441682 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.505117 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-bzs4g"] Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.505575 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.505640 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.521130 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 
00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.538623 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.544283 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.544312 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.544320 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.544335 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.544344 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.553133 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.567525 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638
d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.584877 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: 
I1201 00:07:56.604499 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.622851 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.638302 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.650239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.650302 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.650331 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.650362 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.650380 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.659213 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.659300 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9mz7\" (UniqueName: \"kubernetes.io/projected/10941e4a-3eac-4ef3-a814-c83adcea347e-kube-api-access-w9mz7\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.667870 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 
model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14
c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.703804 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.719075 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:
21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.738170 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.752664 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" event={"ID":"2659d33d-d79b-4e62-845f-f1538638f390","Type":"ContainerStarted","Data":"bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.752713 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" event={"ID":"2659d33d-d79b-4e62-845f-f1538638f390","Type":"ContainerStarted","Data":"f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.753075 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.753095 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc 
kubenswrapper[4911]: I1201 00:07:56.753106 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.753121 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.753133 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.755343 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.756165 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/1.log" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.759837 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.759891 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9mz7\" (UniqueName: \"kubernetes.io/projected/10941e4a-3eac-4ef3-a814-c83adcea347e-kube-api-access-w9mz7\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.760089 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: E1201 00:07:56.760169 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs podName:10941e4a-3eac-4ef3-a814-c83adcea347e nodeName:}" failed. No retries permitted until 2025-12-01 00:07:57.260143534 +0000 UTC m=+37.398840335 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs") pod "network-metrics-daemon-bzs4g" (UID: "10941e4a-3eac-4ef3-a814-c83adcea347e") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.771323 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.787611 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.791299 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9mz7\" (UniqueName: \"kubernetes.io/projected/10941e4a-3eac-4ef3-a814-c83adcea347e-kube-api-access-w9mz7\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.808583 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.824206 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.842376 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.855452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.855524 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.855539 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.855560 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.855587 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.865887 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z 
is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.894732 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/c
ni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\"
,\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc 
kubenswrapper[4911]: I1201 00:07:56.912252 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j
8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.940169 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7
c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.958158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.958189 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.958200 4911 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.958213 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.958223 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:56Z","lastTransitionTime":"2025-12-01T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.960688 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.974140 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:56 crc kubenswrapper[4911]: I1201 00:07:56.987645 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.001450 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:07:56Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.019144 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:57Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.031189 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:57Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.047383 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:57Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.061037 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.061077 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.061088 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.061105 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.061114 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.062594 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:57Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.073115 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167
871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:57Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.085155 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:57Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.097878 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:57Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.111069 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:57Z is after 2025-08-24T17:21:41Z" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.151284 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:57 crc kubenswrapper[4911]: E1201 00:07:57.151442 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.164661 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.165086 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.165106 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.165134 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.165151 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.264888 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:57 crc kubenswrapper[4911]: E1201 00:07:57.265071 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:07:57 crc kubenswrapper[4911]: E1201 00:07:57.265152 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs podName:10941e4a-3eac-4ef3-a814-c83adcea347e nodeName:}" failed. No retries permitted until 2025-12-01 00:07:58.26513071 +0000 UTC m=+38.403827511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs") pod "network-metrics-daemon-bzs4g" (UID: "10941e4a-3eac-4ef3-a814-c83adcea347e") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.267701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.267773 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.267792 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.267820 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.267838 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.371243 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.371293 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.371309 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.371333 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.371351 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.474329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.474379 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.474401 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.474426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.474444 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.581151 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.581831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.581865 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.581891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.581909 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.685432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.685562 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.685588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.685619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.685636 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.788543 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.788596 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.788614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.788636 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.788654 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.891406 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.891496 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.891509 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.891530 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.891543 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.994200 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.994261 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.994278 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.994300 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:57 crc kubenswrapper[4911]: I1201 00:07:57.994316 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:57Z","lastTransitionTime":"2025-12-01T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.096868 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.096959 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.096987 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.097020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.097045 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.151401 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.151419 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.151638 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:07:58 crc kubenswrapper[4911]: E1201 00:07:58.151814 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:07:58 crc kubenswrapper[4911]: E1201 00:07:58.151962 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:07:58 crc kubenswrapper[4911]: E1201 00:07:58.152569 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.200196 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.200267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.200289 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.200325 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.200348 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.276029 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:07:58 crc kubenswrapper[4911]: E1201 00:07:58.276364 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:07:58 crc kubenswrapper[4911]: E1201 00:07:58.276550 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs podName:10941e4a-3eac-4ef3-a814-c83adcea347e nodeName:}" failed. No retries permitted until 2025-12-01 00:08:00.276517485 +0000 UTC m=+40.415214296 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs") pod "network-metrics-daemon-bzs4g" (UID: "10941e4a-3eac-4ef3-a814-c83adcea347e") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.303823 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.303890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.303908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.303935 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.303953 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.407410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.407795 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.407930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.408117 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.408311 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.511641 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.511694 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.511717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.511745 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.511766 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.614101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.614161 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.614177 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.614201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.614219 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.717534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.717593 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.717604 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.717622 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.717635 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.820401 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.820538 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.820584 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.820615 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.820638 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.922995 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.923044 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.923061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.923089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:58 crc kubenswrapper[4911]: I1201 00:07:58.923106 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:58Z","lastTransitionTime":"2025-12-01T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.025934 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.025994 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.026017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.026044 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.026064 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.129622 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.129689 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.129709 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.129736 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.129753 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.150929 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:07:59 crc kubenswrapper[4911]: E1201 00:07:59.151119 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.232720 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.232784 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.232801 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.232824 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.232842 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.336108 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.336166 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.336190 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.336222 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.336246 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.439315 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.439387 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.439410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.439440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.439503 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.542980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.543496 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.543639 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.543785 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.543912 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.647689 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.647785 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.647809 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.647845 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.647863 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.750886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.750966 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.750990 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.751020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.751043 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.854702 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.854768 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.854802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.854831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.854852 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.957905 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.958000 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.958023 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.958051 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:07:59 crc kubenswrapper[4911]: I1201 00:07:59.958071 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:07:59Z","lastTransitionTime":"2025-12-01T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.061337 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.061390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.061402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.061422 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.061436 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.151096 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.151566 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.151679 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:00 crc kubenswrapper[4911]: E1201 00:08:00.151816 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:00 crc kubenswrapper[4911]: E1201 00:08:00.151987 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:00 crc kubenswrapper[4911]: E1201 00:08:00.151770 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.164128 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.164345 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.164551 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.164733 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.164874 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.174445 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.191556 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.223185 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00
:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.244085 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.266557 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.267861 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.267912 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.267928 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc 
kubenswrapper[4911]: I1201 00:08:00.267952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.267971 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.283715 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.300124 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:00 crc kubenswrapper[4911]: E1201 00:08:00.300307 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:08:00 crc kubenswrapper[4911]: E1201 00:08:00.300591 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs podName:10941e4a-3eac-4ef3-a814-c83adcea347e nodeName:}" failed. No retries permitted until 2025-12-01 00:08:04.300563108 +0000 UTC m=+44.439259919 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs") pod "network-metrics-daemon-bzs4g" (UID: "10941e4a-3eac-4ef3-a814-c83adcea347e") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.301743 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.318784 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.333285 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T
00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.356983 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638
d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.370865 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.370921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.370950 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.370980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.371001 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.375225 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.394627 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167
871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.415087 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.436018 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.474532 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.474581 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.474597 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.474621 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.474637 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.475844 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/c
ni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\"
,\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc 
kubenswrapper[4911]: I1201 00:08:00.496416 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.516239 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:00Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.578541 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.578618 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.578645 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.578675 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.578697 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.681339 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.681406 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.681425 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.681450 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.681500 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.784607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.784655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.784671 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.784694 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.784712 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.888364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.888432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.888455 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.888529 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.888550 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.991814 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.991861 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.991870 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.991887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:00 crc kubenswrapper[4911]: I1201 00:08:00.991897 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:00Z","lastTransitionTime":"2025-12-01T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.094782 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.094849 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.094865 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.094896 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.094913 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.151731 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:01 crc kubenswrapper[4911]: E1201 00:08:01.151940 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.197823 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.197896 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.197915 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.197940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.197958 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.301406 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.301508 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.301540 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.301572 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.301594 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.404254 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.404321 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.404339 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.404365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.404386 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.508257 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.508317 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.508331 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.508353 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.508366 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.611699 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.611767 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.611785 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.611811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.611830 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.714905 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.714953 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.714965 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.714985 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.714998 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.817989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.818062 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.818085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.818146 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.818165 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.921401 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.921511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.921535 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.921565 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:01 crc kubenswrapper[4911]: I1201 00:08:01.921589 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:01Z","lastTransitionTime":"2025-12-01T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.024085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.024130 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.024142 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.024160 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.024175 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.130195 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.130254 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.130271 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.130295 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.130309 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.151723 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:02 crc kubenswrapper[4911]: E1201 00:08:02.151843 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.151723 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.151990 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:02 crc kubenswrapper[4911]: E1201 00:08:02.152128 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:02 crc kubenswrapper[4911]: E1201 00:08:02.152381 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.232685 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.232738 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.232753 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.232772 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.232785 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.335323 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.335381 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.335397 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.335491 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.335511 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.438390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.438433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.438444 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.438510 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.438567 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.541103 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.541173 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.541196 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.541228 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.541249 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.644205 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.644263 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.644278 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.644297 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.644310 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.747835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.747918 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.747944 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.747975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.747998 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.850902 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.850979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.850993 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.851022 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.851034 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.953948 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.954017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.954040 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.954068 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:02 crc kubenswrapper[4911]: I1201 00:08:02.954087 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:02Z","lastTransitionTime":"2025-12-01T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.057835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.057892 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.057910 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.057935 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.057952 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.151713 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:03 crc kubenswrapper[4911]: E1201 00:08:03.151960 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.161311 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.161369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.161391 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.161420 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.161442 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.264976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.265013 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.265020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.265035 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.265044 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.367914 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.367975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.367988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.368005 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.368016 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.456952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.456987 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.456999 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.457015 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.457027 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: E1201 00:08:03.472519 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:03Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.482184 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.482229 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.482243 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.482263 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.482276 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: E1201 00:08:03.499765 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:03Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.505115 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.505175 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.505186 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.505206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.505225 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: E1201 00:08:03.520361 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:03Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.525811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.525874 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.525896 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.525920 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.525936 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: E1201 00:08:03.541767 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:03Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.549830 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.550110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.550267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.550515 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.550676 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: E1201 00:08:03.565664 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:03Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:03 crc kubenswrapper[4911]: E1201 00:08:03.565781 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.567519 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.567694 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.567792 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.567910 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.568029 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.671658 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.671987 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.672152 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.672329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.672541 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.775346 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.775387 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.775403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.775423 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.775438 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.878117 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.878188 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.878211 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.878239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.878261 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.981202 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.981235 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.981244 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.981258 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:03 crc kubenswrapper[4911]: I1201 00:08:03.981269 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:03Z","lastTransitionTime":"2025-12-01T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.084041 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.084097 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.084116 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.084140 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.084158 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.151229 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.151253 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:04 crc kubenswrapper[4911]: E1201 00:08:04.151418 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:04 crc kubenswrapper[4911]: E1201 00:08:04.151592 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.151667 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:04 crc kubenswrapper[4911]: E1201 00:08:04.151819 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.186840 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.186893 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.186911 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.186934 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.186951 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.289926 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.289966 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.289979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.289998 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.290010 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.349957 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:04 crc kubenswrapper[4911]: E1201 00:08:04.350239 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:08:04 crc kubenswrapper[4911]: E1201 00:08:04.350372 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs podName:10941e4a-3eac-4ef3-a814-c83adcea347e nodeName:}" failed. No retries permitted until 2025-12-01 00:08:12.350338643 +0000 UTC m=+52.489035444 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs") pod "network-metrics-daemon-bzs4g" (UID: "10941e4a-3eac-4ef3-a814-c83adcea347e") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.392397 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.392452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.392502 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.392520 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.392531 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.495190 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.495260 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.495277 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.495307 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.495328 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.598311 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.598366 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.598382 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.598406 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.598426 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.701549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.701612 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.701634 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.701663 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.701686 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.804834 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.805201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.805417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.805669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.805848 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.908716 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.908781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.908801 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.908827 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:04 crc kubenswrapper[4911]: I1201 00:08:04.908846 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:04Z","lastTransitionTime":"2025-12-01T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.011554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.012063 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.012248 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.012407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.012573 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.115358 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.115783 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.115962 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.116162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.116326 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.150990 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:05 crc kubenswrapper[4911]: E1201 00:08:05.151182 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.220051 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.220554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.221011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.221443 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.221942 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.325286 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.325351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.325379 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.325413 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.325437 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.428219 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.428287 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.428314 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.428342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.428363 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.531672 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.531786 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.531823 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.531853 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.531875 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.635293 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.635342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.635359 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.635384 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.635401 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.738083 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.738129 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.738143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.738164 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.738179 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.840674 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.840737 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.840754 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.840777 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.840793 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.944428 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.944475 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.944525 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.944549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:05 crc kubenswrapper[4911]: I1201 00:08:05.944567 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:05Z","lastTransitionTime":"2025-12-01T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.047822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.047874 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.047892 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.047918 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.047935 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.150695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.150743 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.150758 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.150781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.150801 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.253267 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.253305 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.253320 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.253344 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.253359 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.266829 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:06 crc kubenswrapper[4911]: E1201 00:08:06.266955 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.267356 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:06 crc kubenswrapper[4911]: E1201 00:08:06.267428 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.267509 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:06 crc kubenswrapper[4911]: E1201 00:08:06.267574 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.355531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.355601 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.355614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.355629 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.355640 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.458381 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.458412 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.458422 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.458478 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.458537 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.561262 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.561329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.561353 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.561383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.561405 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.664025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.664089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.664109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.664133 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.664165 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.766995 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.767067 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.767087 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.767110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.767128 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.870263 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.870313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.870329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.870352 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.870369 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.973740 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.973989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.974008 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.974035 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:06 crc kubenswrapper[4911]: I1201 00:08:06.974053 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:06Z","lastTransitionTime":"2025-12-01T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.076788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.076832 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.076842 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.076858 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.076868 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.151056 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:07 crc kubenswrapper[4911]: E1201 00:08:07.151602 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.179706 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.179788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.179813 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.179848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.179872 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.281879 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.281917 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.281930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.281947 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.281957 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.384343 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.384381 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.384393 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.384409 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.384420 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.486195 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.486268 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.486294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.486331 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.486356 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.589737 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.589811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.589834 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.589865 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.589886 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.692414 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.692501 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.692537 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.692569 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.692593 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.795372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.795641 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.795722 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.795822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.795970 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.899762 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.899835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.899856 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.899900 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:07 crc kubenswrapper[4911]: I1201 00:08:07.899922 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:07Z","lastTransitionTime":"2025-12-01T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.003415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.003501 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.003521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.003555 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.003579 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.106783 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.107097 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.107201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.107291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.107385 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.151552 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.151622 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.151693 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:08 crc kubenswrapper[4911]: E1201 00:08:08.151849 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:08 crc kubenswrapper[4911]: E1201 00:08:08.151999 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:08 crc kubenswrapper[4911]: E1201 00:08:08.152119 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.210417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.210843 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.210993 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.211146 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.211288 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.314727 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.315110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.315354 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.315599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.315784 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.419132 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.419507 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.419611 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.419725 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.419820 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.523093 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.523711 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.523759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.523781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.523796 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.626906 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.626963 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.626981 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.627007 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.627024 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.730419 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.730506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.730524 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.730547 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.730561 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.832553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.832612 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.832630 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.832660 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.832677 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.936432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.936565 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.936591 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.936621 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:08 crc kubenswrapper[4911]: I1201 00:08:08.936646 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:08Z","lastTransitionTime":"2025-12-01T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.039281 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.039349 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.039369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.039395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.039415 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.142644 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.142747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.142773 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.142813 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.142840 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.150978 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:09 crc kubenswrapper[4911]: E1201 00:08:09.151154 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.245873 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.245936 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.245952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.245975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.245993 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.349057 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.349109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.349121 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.349144 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.349158 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.451883 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.451955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.451980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.452012 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.452036 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.489946 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.499406 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.516859 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.538149 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.554745 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.554783 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.554794 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.554814 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.554825 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.557632 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.580527 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638
d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.595395 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: 
I1201 00:08:09.615475 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.632156 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.647381 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.658617 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.658697 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.658723 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.658759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.658784 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.678131 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/c
ni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\"
,\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc 
kubenswrapper[4911]: I1201 00:08:09.694323 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.713136 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.727368 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.738716 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.761623 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.761680 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.761695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.761719 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.761737 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.773720 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.786313 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.799130 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc 
kubenswrapper[4911]: I1201 00:08:09.810678 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:09Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.864615 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.864681 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.864693 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.864711 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.864723 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.967688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.967770 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.967792 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.967822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:09 crc kubenswrapper[4911]: I1201 00:08:09.967844 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:09Z","lastTransitionTime":"2025-12-01T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.071508 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.071564 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.071581 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.071607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.071624 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.150726 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:10 crc kubenswrapper[4911]: E1201 00:08:10.150846 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.150970 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:10 crc kubenswrapper[4911]: E1201 00:08:10.151094 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.151174 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:10 crc kubenswrapper[4911]: E1201 00:08:10.151240 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.168353 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release
\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.174908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.174979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.175004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.175032 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.175055 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.234434 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5afaf8fcc9da2f16173b599233f0f8b7e7e9ab3b64b9b4caae08d9deab4a12\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:51Z\\\",\\\"message\\\":\\\"r.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00:07:51.352815 6194 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:07:51.352839 6194 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:07:51.352769 6194 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 00:07:51.352987 6194 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.352998 6194 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 00:07:51.353036 6194 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 00:07:51.353083 6194 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 00:07:51.353245 6194 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 00:07:51.353566 6194 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:07:51.353583 6194 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 00:07:51.353691 6194 factory.go:656] Stopping watch factory\\\\nI1201 00:07:51.353707 6194 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/c
ni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\"
,\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc 
kubenswrapper[4911]: I1201 00:08:10.254018 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.272603 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.278024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.278060 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.278072 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.278090 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.278102 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.289306 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.304928 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.319030 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.350415 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00
:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.366921 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.382107 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.382404 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.382574 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.382717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.382840 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.382898 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc 
kubenswrapper[4911]: I1201 00:08:10.398162 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.418770 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.441068 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.453420 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.476038 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.485772 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.485858 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.485876 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.485901 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.485919 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.494118 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\"
,\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.511278 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.532092 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:10Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.589886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.589934 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.589946 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.589966 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.589979 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.693309 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.693395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.693412 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.693437 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.693462 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.795882 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.795940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.795955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.795977 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.795992 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.899070 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.899134 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.899143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.899160 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:10 crc kubenswrapper[4911]: I1201 00:08:10.899168 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:10Z","lastTransitionTime":"2025-12-01T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.002384 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.002436 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.002446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.002490 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.002501 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.105329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.105422 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.105452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.105562 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.105594 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.151610 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:11 crc kubenswrapper[4911]: E1201 00:08:11.152257 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.152902 4911 scope.go:117] "RemoveContainer" containerID="bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.193800 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.212640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.212678 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.212713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.212733 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.212747 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.232086 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.246763 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.261610 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.276359 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.286556 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.299209 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.315850 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.317897 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.317940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.317951 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc 
kubenswrapper[4911]: I1201 00:08:11.317988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.318000 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.328889 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01
T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.340187 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.359551 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.371482 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.388439 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.406607 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.417108 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:
21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.420789 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.420905 
4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.421010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.421101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.421177 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.428408 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.440298 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.451051 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.523682 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.523717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.523730 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.523752 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.523765 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.626775 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.626811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.626820 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.626835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.626847 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.730053 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.730112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.730164 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.730193 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.730210 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.820913 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/1.log" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.824366 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.824804 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.833135 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.833170 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.833181 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.833199 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.833214 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.837242 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.846893 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.861180 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.875402 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.891374 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167
871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.912336 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.927925 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.936225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:11 crc 
kubenswrapper[4911]: I1201 00:08:11.936482 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.936603 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.936711 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.936792 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:11Z","lastTransitionTime":"2025-12-01T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.949049 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 
model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.967218 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.982213 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:11 crc kubenswrapper[4911]: I1201 00:08:11.993144 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:11Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.010806 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.021595 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.039308 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.039338 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.039346 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.039359 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.039369 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.041274 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.064982 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.074529 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.084790 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.096148 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.143449 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.143535 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.143550 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.143570 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.143585 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.150850 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.150860 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.150911 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.151385 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.151597 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.151659 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.245913 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.245954 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.245963 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.245978 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.245988 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.338619 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.338805 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.338854 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.339062 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.339198 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:08:44.339076931 +0000 UTC m=+84.477773702 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.339284 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.339375 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.339407 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.339331 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:08:44.339319318 +0000 UTC m=+84.478016079 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.339925 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 00:08:44.339902574 +0000 UTC m=+84.478599345 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.348381 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.348444 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.348489 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.348516 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.348536 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.439726 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.440212 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.440290 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.440318 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.440417 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-01 00:08:44.440384195 +0000 UTC m=+84.579081036 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.440221 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.440759 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.441010 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.441133 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs podName:10941e4a-3eac-4ef3-a814-c83adcea347e nodeName:}" failed. No retries permitted until 2025-12-01 00:08:28.441104485 +0000 UTC m=+68.579801296 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs") pod "network-metrics-daemon-bzs4g" (UID: "10941e4a-3eac-4ef3-a814-c83adcea347e") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.441169 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.441350 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:08:44.441305351 +0000 UTC m=+84.580002172 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.451706 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.451752 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.451770 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.451794 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.451811 4911 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.554591 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.554656 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.554678 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.554709 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.554731 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.657603 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.658504 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.658748 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.658978 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.659185 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.762821 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.762871 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.762881 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.762897 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.762909 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.832353 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/2.log" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.833940 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/1.log" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.838123 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8" exitCode=1 Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.838181 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.838266 4911 scope.go:117] "RemoveContainer" containerID="bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.839519 4911 scope.go:117] "RemoveContainer" containerID="1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8" Dec 01 00:08:12 crc kubenswrapper[4911]: E1201 00:08:12.839867 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.861876 4911 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3
aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.867697 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.867764 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.867781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.867809 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.867827 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.880235 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.898095 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.919261 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.936233 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.955448 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167
871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.970754 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.970815 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.970833 4911 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.970860 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.970878 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:12Z","lastTransitionTime":"2025-12-01T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.983164 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:12 crc kubenswrapper[4911]: I1201 00:08:12.997895 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:12Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.020750 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb02d54adf8ef67dccb2603013abbb65432b7e678ec32307a9dfd9a868dbdfc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:07:54Z\\\",\\\"message\\\":\\\"s.LB{services.LB{Name:\\\\\\\"Service_openshift-console/downloads_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.213\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1201 00:07:53.653913 6346 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:07:53Z is after 2025-08-24T17:21:41Z]\\\\nI1201 00:07:53.653904 6346 model_c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\
\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\
\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc 
kubenswrapper[4911]: I1201 00:08:13.034123 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.046182 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.058172 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.068757 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.072870 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.072904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.072916 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.072936 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.072952 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.079798 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.101392 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.115579 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.131891 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc 
kubenswrapper[4911]: I1201 00:08:13.151551 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:13 crc kubenswrapper[4911]: E1201 00:08:13.151670 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.152409 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.176208 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.176261 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.176277 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.176301 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.176318 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.279004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.279175 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.279241 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.279329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.279392 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.382691 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.382757 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.382774 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.382811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.382830 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.484903 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.484933 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.484942 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.484956 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.484965 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.587840 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.588158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.588229 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.588506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.588735 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.691175 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.691269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.691291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.691329 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.691366 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.794265 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.794585 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.794653 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.794719 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.794774 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.810021 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.810065 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.810077 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.810094 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.810108 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: E1201 00:08:13.823914 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.830621 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.830660 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.830669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.830685 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.830694 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.845277 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/2.log" Dec 01 00:08:13 crc kubenswrapper[4911]: E1201 00:08:13.845997 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\
\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":
485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"sys
temUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.850764 4911 scope.go:117] "RemoveContainer" containerID="1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8" Dec 01 00:08:13 crc kubenswrapper[4911]: E1201 00:08:13.851170 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.855156 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.855198 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.855214 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.855237 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.855254 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: E1201 00:08:13.877429 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.882047 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b2670
2f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce9
0915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.882273 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.882304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.882313 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.882327 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.882337 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: E1201 00:08:13.894026 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.897611 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.897656 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.897673 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.897698 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.897715 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.901010 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: E1201 00:08:13.914814 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: E1201 00:08:13.915606 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.918122 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.918159 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.918169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.918187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.918198 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:13Z","lastTransitionTime":"2025-12-01T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.920765 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.937058 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.950780 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.965775 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T0
0:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.982311 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:13 crc kubenswrapper[4911]: I1201 00:08:13.996609 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:13Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc 
kubenswrapper[4911]: I1201 00:08:14.008346 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.020644 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.020705 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.020727 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.020761 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.020784 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.020989 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167
871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.036927 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.056002 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.070977 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.093141 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.110367 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.124343 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.124421 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.124507 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.124528 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.124554 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.124574 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.142261 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z 
is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.151129 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.151323 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.151354 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:14 crc kubenswrapper[4911]: E1201 00:08:14.151494 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:14 crc kubenswrapper[4911]: E1201 00:08:14.151648 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:14 crc kubenswrapper[4911]: E1201 00:08:14.151920 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.171785 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:14Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.227904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.227958 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.227979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.228002 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.228020 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.331639 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.331691 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.331710 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.331734 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.331751 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.435316 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.435735 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.435883 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.436240 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.436388 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.539792 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.539910 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.539927 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.539997 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.540016 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.643537 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.643605 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.643626 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.643653 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.643675 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.746053 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.746109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.746125 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.746149 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.746165 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.850495 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.850546 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.850573 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.850670 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.850697 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.954357 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.954412 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.954429 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.954453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:14 crc kubenswrapper[4911]: I1201 00:08:14.954514 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:14Z","lastTransitionTime":"2025-12-01T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.057783 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.057848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.057866 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.057903 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.057922 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.151089 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:15 crc kubenswrapper[4911]: E1201 00:08:15.151305 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.161102 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.161142 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.161150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.161167 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.161177 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.264543 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.264614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.264639 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.264667 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.264687 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.367617 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.367695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.367718 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.367749 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.367774 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.471510 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.471598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.471627 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.471662 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.471680 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.574504 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.574573 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.574621 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.574650 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.574670 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.678658 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.678723 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.678741 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.678763 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.678780 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.781895 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.781990 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.782024 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.782057 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.782080 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.885337 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.885395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.885411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.885434 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.885454 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.988688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.988740 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.988757 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.988779 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:15 crc kubenswrapper[4911]: I1201 00:08:15.988796 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:15Z","lastTransitionTime":"2025-12-01T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.091144 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.091193 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.091201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.091213 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.091221 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.151650 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.151703 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.151661 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:16 crc kubenswrapper[4911]: E1201 00:08:16.151899 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:16 crc kubenswrapper[4911]: E1201 00:08:16.152009 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:16 crc kubenswrapper[4911]: E1201 00:08:16.152154 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.193786 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.193822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.193831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.193845 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.193855 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.296968 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.297028 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.297047 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.297072 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.297089 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.405261 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.405317 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.405334 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.405360 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.405377 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.508332 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.508371 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.508383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.508399 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.508413 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.611277 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.611402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.611422 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.611444 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.611483 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.713939 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.714011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.714029 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.714054 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.714073 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.816623 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.816855 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.816909 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.816939 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.816957 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.920036 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.920075 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.920087 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.920105 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:16 crc kubenswrapper[4911]: I1201 00:08:16.920116 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:16Z","lastTransitionTime":"2025-12-01T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.023208 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.023287 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.023310 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.023398 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.023423 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.126740 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.126799 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.126821 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.126847 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.126864 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.151270 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:17 crc kubenswrapper[4911]: E1201 00:08:17.151556 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.230416 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.230511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.230532 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.230558 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.230581 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.333410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.333504 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.333525 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.333549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.333569 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.436813 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.436887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.436915 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.436946 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.436971 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.538980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.539028 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.539042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.539059 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.539073 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.642453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.642577 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.642595 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.642618 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.642636 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.745793 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.745863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.745887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.745912 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.745932 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.849373 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.849560 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.849596 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.849622 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.849642 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.952829 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.952890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.952908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.952938 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:17 crc kubenswrapper[4911]: I1201 00:08:17.952961 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:17Z","lastTransitionTime":"2025-12-01T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.056322 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.056389 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.056407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.056433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.056485 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.151120 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.151181 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:18 crc kubenswrapper[4911]: E1201 00:08:18.151350 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.151377 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:18 crc kubenswrapper[4911]: E1201 00:08:18.151533 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:18 crc kubenswrapper[4911]: E1201 00:08:18.151653 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.159602 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.159669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.159689 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.159715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.159736 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.263976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.264035 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.264047 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.264065 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.264082 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.368303 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.368693 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.368843 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.368984 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.369131 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.471850 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.471891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.471904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.471921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.471933 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.574886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.574947 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.574963 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.574988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.575008 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.678220 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.678294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.678314 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.678340 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.678358 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.781335 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.781393 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.781416 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.781446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.781506 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.884820 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.884895 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.884913 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.884938 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.884961 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.987410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.987719 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.987804 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.987921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:18 crc kubenswrapper[4911]: I1201 00:08:18.987990 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:18Z","lastTransitionTime":"2025-12-01T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.091091 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.091159 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.091181 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.091211 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.091239 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.151360 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:19 crc kubenswrapper[4911]: E1201 00:08:19.151593 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.193833 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.193878 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.193898 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.193923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.193942 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.296231 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.296328 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.296347 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.296374 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.296394 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.399794 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.399855 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.399879 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.399905 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.399923 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.503301 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.503380 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.503405 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.503437 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.503455 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.607229 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.607291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.607300 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.607316 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.607326 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.710619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.710677 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.710694 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.710718 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.710735 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.813301 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.813355 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.813375 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.813400 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.813418 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.916453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.916558 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.916584 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.916613 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:19 crc kubenswrapper[4911]: I1201 00:08:19.916630 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:19Z","lastTransitionTime":"2025-12-01T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.020021 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.020080 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.020097 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.020124 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.020142 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.124015 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.124079 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.124101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.124131 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.124154 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.152335 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.153014 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.152860 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:20 crc kubenswrapper[4911]: E1201 00:08:20.153745 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:20 crc kubenswrapper[4911]: E1201 00:08:20.154015 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:20 crc kubenswrapper[4911]: E1201 00:08:20.154218 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.186065 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.208799 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.226416 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.228639 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.228711 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.228737 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.228768 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.228792 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.245155 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.263756 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.278217 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc 
kubenswrapper[4911]: I1201 00:08:20.292825 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.305519 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.331983 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.332257 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.332350 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.332448 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.332602 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.338665 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.353326 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.370028 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.386890 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.406376 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.422580 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.435906 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.435930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.435939 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.435953 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.435963 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.462183 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.481827 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.496861 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc 
kubenswrapper[4911]: I1201 00:08:20.514682 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:20Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.539861 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.539929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.539949 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.540028 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.540049 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.643788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.643862 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.643875 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.643894 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.643906 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.747670 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.747731 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.747749 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.747774 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.747792 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.856565 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.856646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.856668 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.856717 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.856740 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.961017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.961129 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.961547 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.961614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:20 crc kubenswrapper[4911]: I1201 00:08:20.961920 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:20Z","lastTransitionTime":"2025-12-01T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.067102 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.067176 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.067201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.067234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.067258 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.151198 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:21 crc kubenswrapper[4911]: E1201 00:08:21.151392 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.170604 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.170667 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.170689 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.170718 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.170738 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.273503 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.273580 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.273608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.273639 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.273661 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.376335 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.376390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.376410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.376440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.376499 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.479865 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.479929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.479955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.479983 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.480004 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.583305 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.583380 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.583403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.583433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.583491 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.686563 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.686655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.686674 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.686704 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.686725 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.790807 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.790892 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.790903 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.790924 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.790936 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.894003 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.894080 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.894101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.894128 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.894147 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.996751 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.996829 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.996847 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.996875 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:21 crc kubenswrapper[4911]: I1201 00:08:21.996894 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:21Z","lastTransitionTime":"2025-12-01T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.100273 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.100358 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.100383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.100412 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.100434 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.151347 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.151412 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.151347 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:22 crc kubenswrapper[4911]: E1201 00:08:22.151514 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:22 crc kubenswrapper[4911]: E1201 00:08:22.151538 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:22 crc kubenswrapper[4911]: E1201 00:08:22.151714 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.203848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.203908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.203926 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.203974 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.203991 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.308553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.308747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.308768 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.308801 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.308825 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.412832 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.412887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.412896 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.412914 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.412927 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.516425 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.516610 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.516630 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.516655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.516710 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.618876 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.618940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.618957 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.618979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.618996 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.722048 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.722093 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.722105 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.722125 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.722138 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.825543 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.825580 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.825589 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.825604 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.825614 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.928341 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.928406 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.928417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.928440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:22 crc kubenswrapper[4911]: I1201 00:08:22.928482 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:22Z","lastTransitionTime":"2025-12-01T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.031715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.031802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.031826 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.031854 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.031873 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.134838 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.134898 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.134912 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.134934 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.134949 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.151450 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:23 crc kubenswrapper[4911]: E1201 00:08:23.151730 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.238307 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.238371 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.238391 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.238434 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.238509 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.341834 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.341942 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.341975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.342118 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.342145 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.444745 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.444812 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.444829 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.444888 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.444906 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.547236 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.547305 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.547324 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.547351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.547370 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.650132 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.650243 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.650269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.650297 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.650319 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.753901 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.754025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.754049 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.754074 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.754094 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.856776 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.856811 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.856823 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.856840 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.856852 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.939351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.939406 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.939416 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.939432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.939443 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:23 crc kubenswrapper[4911]: E1201 00:08:23.955331 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:23Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.960019 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.960085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.960096 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.960116 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:23 crc kubenswrapper[4911]: I1201 00:08:23.960129 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:23Z","lastTransitionTime":"2025-12-01T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: E1201 00:08:24.021779 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.027408 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.027496 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.027511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.027539 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.027554 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: E1201 00:08:24.046508 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:24Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:24 crc kubenswrapper[4911]: E1201 00:08:24.046746 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.048989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.049027 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.049044 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.049066 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.049079 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.150715 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.150730 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:24 crc kubenswrapper[4911]: E1201 00:08:24.150924 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.150729 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:24 crc kubenswrapper[4911]: E1201 00:08:24.151006 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:24 crc kubenswrapper[4911]: E1201 00:08:24.151126 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.152731 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.152769 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.152781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.152798 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.152811 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.255551 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.255626 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.255646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.255681 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.255706 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.359172 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.359226 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.359239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.359575 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.359615 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.463208 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.463266 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.463284 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.463307 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.463326 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.566243 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.566300 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.566316 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.566340 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.566356 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.669881 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.669974 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.669991 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.670015 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.670032 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.773023 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.773088 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.773113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.773142 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.773165 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.875765 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.875831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.875856 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.875887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.875905 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.979160 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.979279 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.979300 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.979330 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:24 crc kubenswrapper[4911]: I1201 00:08:24.979355 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:24Z","lastTransitionTime":"2025-12-01T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.082359 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.082429 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.082450 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.082541 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.082590 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.151079 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:25 crc kubenswrapper[4911]: E1201 00:08:25.151316 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.185701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.185760 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.185780 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.185810 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.185829 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.290129 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.290183 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.290197 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.290219 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.290236 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.393161 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.393264 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.393289 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.393320 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.393342 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.495705 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.495733 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.495742 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.495757 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.495768 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.598398 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.598509 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.598536 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.598560 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.598577 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.701008 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.701039 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.701049 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.701065 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.701078 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.803395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.803425 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.803436 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.803451 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.803486 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.905988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.906045 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.906065 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.906091 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:25 crc kubenswrapper[4911]: I1201 00:08:25.906109 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:25Z","lastTransitionTime":"2025-12-01T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.008933 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.009011 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.009025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.009050 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.009062 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.111973 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.112044 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.112060 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.112089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.112105 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.151812 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.151825 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.151960 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:26 crc kubenswrapper[4911]: E1201 00:08:26.152140 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:26 crc kubenswrapper[4911]: E1201 00:08:26.152250 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:26 crc kubenswrapper[4911]: E1201 00:08:26.152414 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.215599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.215654 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.215664 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.215689 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.215703 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.318885 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.318942 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.318955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.318982 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.318994 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.422434 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.422511 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.422524 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.422576 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.422609 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.526254 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.526304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.526314 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.526333 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.526344 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.628966 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.629049 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.629083 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.629104 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.629115 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.731980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.732025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.732040 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.732061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.732080 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.834780 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.834823 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.834832 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.834852 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.834861 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.938042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.938094 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.938108 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.938128 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:26 crc kubenswrapper[4911]: I1201 00:08:26.938141 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:26Z","lastTransitionTime":"2025-12-01T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.047158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.047207 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.047219 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.047248 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.047261 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.150937 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.150958 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.151002 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.151019 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.151042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.151062 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: E1201 00:08:27.151185 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.254428 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.254516 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.254536 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.254563 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.254581 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.358127 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.358210 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.358230 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.358256 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.358313 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.461231 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.461296 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.461313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.461334 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.461350 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.564535 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.564602 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.564624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.564653 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.564674 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.667545 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.667579 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.667590 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.667607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.667617 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.770162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.770215 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.770234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.770258 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.770275 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.874178 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.874252 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.874273 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.874313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.874337 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.977199 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.977245 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.977256 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.977273 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:27 crc kubenswrapper[4911]: I1201 00:08:27.977284 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:27Z","lastTransitionTime":"2025-12-01T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.079818 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.079874 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.079886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.079904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.079919 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.159020 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.159110 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.159051 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:28 crc kubenswrapper[4911]: E1201 00:08:28.159208 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:28 crc kubenswrapper[4911]: E1201 00:08:28.159346 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:28 crc kubenswrapper[4911]: E1201 00:08:28.159548 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.160760 4911 scope.go:117] "RemoveContainer" containerID="1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8" Dec 01 00:08:28 crc kubenswrapper[4911]: E1201 00:08:28.161047 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.182940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.182995 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.183013 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.183037 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.183056 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.285652 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.285715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.285739 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.285773 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.285793 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.388565 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.388603 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.388614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.388629 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.388640 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.490527 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.490593 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.490611 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.490639 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.490656 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.499445 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:28 crc kubenswrapper[4911]: E1201 00:08:28.499652 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:08:28 crc kubenswrapper[4911]: E1201 00:08:28.499756 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs podName:10941e4a-3eac-4ef3-a814-c83adcea347e nodeName:}" failed. No retries permitted until 2025-12-01 00:09:00.499729137 +0000 UTC m=+100.638425938 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs") pod "network-metrics-daemon-bzs4g" (UID: "10941e4a-3eac-4ef3-a814-c83adcea347e") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.593557 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.593610 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.593623 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.593642 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.593656 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.696654 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.696724 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.696745 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.696776 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.696797 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.798822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.798892 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.798917 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.798948 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.798970 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.901204 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.901241 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.901251 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.901265 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.901275 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:28Z","lastTransitionTime":"2025-12-01T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.908309 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/0.log" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.908358 4911 generic.go:334] "Generic (PLEG): container finished" podID="0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f" containerID="500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca" exitCode=1 Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.908384 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h54fr" event={"ID":"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f","Type":"ContainerDied","Data":"500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca"} Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.908710 4911 scope.go:117] "RemoveContainer" containerID="500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.922928 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.940356 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167
871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.959057 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.975659 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:28 crc kubenswrapper[4911]: I1201 00:08:28.987226 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:28Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.003186 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.004421 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.004451 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.004482 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.004501 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.004513 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.021520 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358
25771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.033930 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.047410 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"2025-12-01T00:07:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f\\\\n2025-12-01T00:07:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f to /host/opt/cni/bin/\\\\n2025-12-01T00:07:43Z [verbose] multus-daemon started\\\\n2025-12-01T00:07:43Z [verbose] Readiness Indicator file check\\\\n2025-12-01T00:08:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.073126 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.102958 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.107308 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.107355 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.107371 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.107396 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.107422 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.117029 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83
585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.132486 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.147218 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.150657 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:29 crc kubenswrapper[4911]: E1201 00:08:29.150997 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.159930 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.176958 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.194530 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.210822 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.210901 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.211317 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.211452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.211655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.211799 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.314758 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.314830 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.314856 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.314890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.314913 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.417534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.417808 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.417869 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.417944 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.418008 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.521864 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.521973 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.521986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.522010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.522026 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.626002 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.626302 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.626489 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.626625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.626726 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.730685 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.731022 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.731159 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.731285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.731396 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.834955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.835186 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.835326 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.835515 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.835664 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.913746 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/0.log" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.913833 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h54fr" event={"ID":"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f","Type":"ContainerStarted","Data":"44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.932255 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.938655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.938720 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.938737 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.938768 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.938788 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:29Z","lastTransitionTime":"2025-12-01T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.946836 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"2025-12-01T00:07:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f\\\\n2025-12-01T00:07:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f to /host/opt/cni/bin/\\\\n2025-12-01T00:07:43Z [verbose] multus-daemon started\\\\n2025-12-01T00:07:43Z [verbose] Readiness Indicator file check\\\\n2025-12-01T00:08:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.972684 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.985527 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:29 crc kubenswrapper[4911]: I1201 00:08:29.997584 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:29Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.010594 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.024169 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.035225 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.041806 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.041835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.041844 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.041858 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.041870 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.051952 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.064450 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.081639 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc 
kubenswrapper[4911]: I1201 00:08:30.095984 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.112010 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.131751 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.144675 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.145049 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.145344 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.145608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.145851 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.149345 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.151722 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.151749 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:30 crc kubenswrapper[4911]: E1201 00:08:30.151893 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:30 crc kubenswrapper[4911]: E1201 00:08:30.152013 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.153747 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:30 crc kubenswrapper[4911]: E1201 00:08:30.154056 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.173284 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.188917 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf
3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.203740 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.226730 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.239735 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.248859 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.248924 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.248945 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.248975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.248997 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.258681 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.277293 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.289988 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.305942 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T0
0:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.323358 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.338848 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc 
kubenswrapper[4911]: I1201 00:08:30.351767 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.351809 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.351820 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.351841 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.351854 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.353115 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.365891 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167
871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.383029 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.399228 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.414945 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.442575 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.454957 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.454989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.455001 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.455018 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.455032 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.458960 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358
25771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.471532 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.486666 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"2025-12-01T00:07:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f\\\\n2025-12-01T00:07:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f to /host/opt/cni/bin/\\\\n2025-12-01T00:07:43Z [verbose] multus-daemon started\\\\n2025-12-01T00:07:43Z [verbose] Readiness Indicator file check\\\\n2025-12-01T00:08:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.506960 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:30Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.558056 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.558150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.558161 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.558178 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.558189 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.660746 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.660800 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.660810 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.660826 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.660838 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.762716 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.762741 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.762749 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.762763 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.762772 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.865549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.865586 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.865597 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.865612 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.865621 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.968372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.968419 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.968431 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.968448 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:30 crc kubenswrapper[4911]: I1201 00:08:30.968480 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:30Z","lastTransitionTime":"2025-12-01T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.071372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.071415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.071424 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.071441 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.071469 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.151435 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:31 crc kubenswrapper[4911]: E1201 00:08:31.151596 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.174247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.174291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.174302 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.174318 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.174330 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.277442 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.277539 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.277563 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.277587 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.277605 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.380349 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.380392 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.380400 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.380414 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.380424 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.483380 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.483452 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.483506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.483521 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.483531 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.586725 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.586772 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.586783 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.586799 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.586810 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.689736 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.689782 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.689795 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.689816 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.689829 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.792908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.792963 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.792978 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.792995 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.793006 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.896931 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.896977 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.896987 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.897006 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:31 crc kubenswrapper[4911]: I1201 00:08:31.897015 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:31Z","lastTransitionTime":"2025-12-01T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.000095 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.000151 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.000164 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.000186 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.000198 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.102952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.103005 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.103017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.103034 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.103048 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.151583 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.151684 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:32 crc kubenswrapper[4911]: E1201 00:08:32.151759 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.151850 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:32 crc kubenswrapper[4911]: E1201 00:08:32.152005 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:32 crc kubenswrapper[4911]: E1201 00:08:32.152141 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.206556 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.206608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.206616 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.206632 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.206643 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.309446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.309504 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.309515 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.309530 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.309543 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.412567 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.412615 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.412627 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.412646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.412661 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.514738 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.514782 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.514801 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.514826 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.514844 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.617361 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.617434 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.617451 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.617513 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.617534 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.719815 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.719893 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.719909 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.719931 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.719950 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.823184 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.823279 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.823300 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.823330 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.823352 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.925838 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.925960 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.926000 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.926035 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:32 crc kubenswrapper[4911]: I1201 00:08:32.926059 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:32Z","lastTransitionTime":"2025-12-01T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.029163 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.029224 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.029263 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.029288 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.029301 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.132682 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.132913 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.133068 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.133205 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.133287 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.150709 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:33 crc kubenswrapper[4911]: E1201 00:08:33.150857 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.236007 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.236649 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.236706 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.236742 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.236767 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.340112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.340169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.340186 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.340225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.340243 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.443058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.443120 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.443140 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.443166 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.443185 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.545837 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.545887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.545905 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.545929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.545945 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.648740 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.648805 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.648822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.648847 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.648864 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.751247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.751297 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.751319 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.751347 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.751368 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.854184 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.854611 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.854755 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.854886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.855009 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.958228 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.958576 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.958700 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.958952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:33 crc kubenswrapper[4911]: I1201 00:08:33.959091 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:33Z","lastTransitionTime":"2025-12-01T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.061397 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.061476 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.061493 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.061510 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.061522 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.151419 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.151498 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.151521 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.152275 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.152328 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.152687 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.164105 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.164154 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.164175 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.164201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.164221 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.230584 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.230640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.230658 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.230683 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.230701 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.251739 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:34Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.256342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.256407 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.256418 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.256430 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.256469 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.277768 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:34Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.282594 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.282696 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.282775 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.282843 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.282906 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.302116 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:34Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.306851 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.306882 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.306891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.306903 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.306914 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.326168 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:34Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.330581 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.330786 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.330925 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.331058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.331186 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.352191 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:34Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:34 crc kubenswrapper[4911]: E1201 00:08:34.352328 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.354667 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.354703 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.354716 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.354738 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.354757 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.458043 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.458093 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.458109 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.458133 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.458150 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.560700 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.560752 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.560771 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.560796 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.560814 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.663929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.664514 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.664678 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.664815 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.664955 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.767531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.767587 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.767608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.767634 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.767653 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.870383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.870697 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.870846 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.870995 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.871136 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.974578 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.974651 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.974671 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.974698 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:34 crc kubenswrapper[4911]: I1201 00:08:34.974717 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:34Z","lastTransitionTime":"2025-12-01T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.078111 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.078514 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.078686 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.078986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.079123 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.150875 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:35 crc kubenswrapper[4911]: E1201 00:08:35.151030 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.182851 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.182912 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.182929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.182955 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.182976 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.285792 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.285851 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.285867 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.285892 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.285914 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.388356 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.388426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.388445 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.388512 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.388531 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.491186 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.491240 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.491258 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.491281 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.491302 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.594046 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.594098 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.594112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.594132 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.594147 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.697292 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.697364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.697389 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.697418 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.697443 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.800908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.800974 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.800996 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.801019 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.801034 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.904139 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.904216 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.904240 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.904271 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:35 crc kubenswrapper[4911]: I1201 00:08:35.904294 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:35Z","lastTransitionTime":"2025-12-01T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.007754 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.007824 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.007844 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.007870 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.007890 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.111137 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.111189 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.111207 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.111229 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.111246 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.151680 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:36 crc kubenswrapper[4911]: E1201 00:08:36.151843 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.152226 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:36 crc kubenswrapper[4911]: E1201 00:08:36.152338 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.152717 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:36 crc kubenswrapper[4911]: E1201 00:08:36.152859 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.213930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.213983 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.214000 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.214021 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.214038 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.317152 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.317215 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.317239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.317270 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.317290 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.420234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.420305 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.420324 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.420351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.420371 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.524151 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.524214 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.524235 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.524264 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.524284 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.626970 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.627016 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.627028 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.627069 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.627083 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.730989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.731040 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.731097 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.731126 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.731143 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.835070 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.835215 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.835236 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.835262 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.835280 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.938625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.938679 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.938698 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.938722 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:36 crc kubenswrapper[4911]: I1201 00:08:36.938743 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:36Z","lastTransitionTime":"2025-12-01T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.042000 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.042070 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.042089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.042115 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.042132 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.144937 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.144977 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.144988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.145002 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.145014 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.151237 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:37 crc kubenswrapper[4911]: E1201 00:08:37.151371 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.248583 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.248645 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.248661 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.248690 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.248707 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.351372 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.351430 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.351446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.351474 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.351516 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.455635 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.455707 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.455727 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.455753 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.455772 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.558976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.559058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.559082 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.559114 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.559134 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.662365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.662426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.662445 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.662497 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.662519 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.765411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.765520 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.765539 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.765564 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.765581 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.867932 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.867993 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.868016 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.868044 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.868066 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.970678 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.970726 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.970744 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.970767 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:37 crc kubenswrapper[4911]: I1201 00:08:37.970784 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:37Z","lastTransitionTime":"2025-12-01T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.074009 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.074048 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.074063 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.074083 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.074097 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.151108 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.151156 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.151121 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:38 crc kubenswrapper[4911]: E1201 00:08:38.151316 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:38 crc kubenswrapper[4911]: E1201 00:08:38.151527 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:38 crc kubenswrapper[4911]: E1201 00:08:38.151640 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.176981 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.177027 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.177053 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.177094 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.177117 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.279970 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.280034 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.280054 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.280079 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.280096 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.383358 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.383445 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.383534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.383583 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.383603 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.486411 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.486497 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.486515 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.486540 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.486587 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.590164 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.590253 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.590271 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.590293 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.590310 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.693055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.693124 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.693141 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.693164 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.693182 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.796055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.796126 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.796144 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.796171 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.796188 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.900122 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.900212 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.900236 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.900269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:38 crc kubenswrapper[4911]: I1201 00:08:38.900293 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:38Z","lastTransitionTime":"2025-12-01T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.002418 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.002519 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.002624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.002710 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.002780 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.106816 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.106864 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.106875 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.106893 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.106906 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.151846 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:39 crc kubenswrapper[4911]: E1201 00:08:39.152005 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.209549 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.209596 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.209610 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.209629 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.209642 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.313180 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.313254 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.313275 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.313798 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.313858 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.417142 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.417205 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.417227 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.417253 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.417272 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.519245 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.519304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.519321 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.519344 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.519360 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.622877 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.622936 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.622958 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.622986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.623006 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.726289 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.726354 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.726371 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.726398 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.726417 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.829108 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.829156 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.829167 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.829187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.829200 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.932133 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.932207 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.932225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.932252 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:39 crc kubenswrapper[4911]: I1201 00:08:39.932267 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:39Z","lastTransitionTime":"2025-12-01T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.034732 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.034783 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.034796 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.034822 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.034836 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.137282 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.137335 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.137355 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.137378 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.137395 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.150918 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.151007 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:40 crc kubenswrapper[4911]: E1201 00:08:40.151134 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.151173 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:40 crc kubenswrapper[4911]: E1201 00:08:40.151338 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:40 crc kubenswrapper[4911]: E1201 00:08:40.151739 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.174077 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.193779 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.210646 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.227759 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.240187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.240244 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.240264 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.240293 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.240312 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.249051 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.269770 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.285692 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.307440 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.324715 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00
be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.342574 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.345038 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.345286 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.345307 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.345332 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.345381 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.362101 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.383081 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500
db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"2025-12-01T00:07:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f\\\\n2025-12-01T00:07:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f to /host/opt/cni/bin/\\\\n2025-12-01T00:07:43Z [verbose] multus-daemon started\\\\n2025-12-01T00:07:43Z [verbose] Readiness Indicator file check\\\\n2025-12-01T00:08:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.413304 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.445331 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.448390 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.448449 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.448509 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.448546 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.448569 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.464643 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83
585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.490171 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.510926 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.527396 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:40Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.550879 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.550940 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.550965 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.550998 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.551020 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.654148 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.654180 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.654189 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.654206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.654216 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.757050 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.757132 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.757150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.757178 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.757197 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.860572 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.860625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.860645 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.860670 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.860689 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.963736 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.963791 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.963813 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.963843 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:40 crc kubenswrapper[4911]: I1201 00:08:40.963865 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:40Z","lastTransitionTime":"2025-12-01T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.066637 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.066697 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.066715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.066741 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.066760 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.150951 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:41 crc kubenswrapper[4911]: E1201 00:08:41.151331 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.152531 4911 scope.go:117] "RemoveContainer" containerID="1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.169998 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.170056 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.170080 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.170113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.170140 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.274216 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.274264 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.274283 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.274308 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.274325 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.377022 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.377074 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.377089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.377113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.377134 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.480283 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.480349 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.480369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.480394 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.480411 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.584071 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.584130 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.584147 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.584172 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.584188 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.686737 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.686802 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.686821 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.686848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.686868 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.791087 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.791148 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.791168 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.791199 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.791219 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.901110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.901177 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.901195 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.901219 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:41 crc kubenswrapper[4911]: I1201 00:08:41.901237 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:41Z","lastTransitionTime":"2025-12-01T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.004249 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.005070 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.005274 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.005567 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.005818 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.120548 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.120923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.121030 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.121150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.121248 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.150930 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.151080 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.150932 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:42 crc kubenswrapper[4911]: E1201 00:08:42.151367 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:42 crc kubenswrapper[4911]: E1201 00:08:42.151537 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:42 crc kubenswrapper[4911]: E1201 00:08:42.151700 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.225010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.225814 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.225932 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.226034 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.226113 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.330199 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.330262 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.330283 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.330313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.330334 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.433504 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.433601 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.433623 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.433649 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.433665 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.536738 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.536840 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.536896 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.536996 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.537091 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.640121 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.640202 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.640220 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.640246 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.640265 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.742968 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.743019 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.743037 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.743061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.743079 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.826000 4911 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.722427114s: [/var/lib/containers/storage/overlay/e73ac1f3f9fd4cca879de1586df000aee9d0ef2b3fcb8b49d98631e3334dfb1f/diff /var/log/pods/openshift-network-operator_network-operator-58b4c7f79c-55gtf_37a5e44f-9a88-4405-be8a-b645485e7312/network-operator/0.log]; will not log again for this container unless duration exceeds 2s Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.826157 4911 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.855405161s: [/var/lib/containers/storage/overlay/ab1a327b0d6c22b2c58a1cf1784e5cb7b52a0f520949f71fdef753a4e4767520/diff /var/log/pods/openshift-network-node-identity_network-node-identity-vrzqb_ef543e1b-8068-4ea3-b32a-61027b32e95d/webhook/0.log]; will not log again for this container unless duration exceeds 2s Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.827594 4911 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.709454861s: [/var/lib/containers/storage/overlay/4a150b5bc2a25048878d5e17f0ba0c149fba59069589d580c9385e8b9c3181dc/diff /var/log/pods/openshift-network-node-identity_network-node-identity-vrzqb_ef543e1b-8068-4ea3-b32a-61027b32e95d/approver/0.log]; will not log again for this container unless duration exceeds 2s Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.845598 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.845850 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.845931 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.846024 4911 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.846106 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.948901 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.948949 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.948964 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.948985 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.949000 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:42Z","lastTransitionTime":"2025-12-01T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:42 crc kubenswrapper[4911]: I1201 00:08:42.977293 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/2.log" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.050996 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.051028 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.051042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.051060 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.051070 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.151616 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:43 crc kubenswrapper[4911]: E1201 00:08:43.151888 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.153427 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.153492 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.153505 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.153532 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.153547 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.164855 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.256703 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.256757 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.256773 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.256797 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.256815 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.362346 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.362405 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.362424 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.362450 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.362503 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.465622 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.465975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.466178 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.466398 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.466556 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.569160 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.569214 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.569230 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.569259 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.569276 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.676285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.676374 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.676396 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.676431 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.676486 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.778785 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.778851 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.778868 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.778888 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.778905 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.880990 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.881615 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.881688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.881766 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.881837 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.984363 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.984538 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.984609 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.984638 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.984656 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:43Z","lastTransitionTime":"2025-12-01T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.986233 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/2.log" Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.990401 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} Dec 01 00:08:43 crc kubenswrapper[4911]: I1201 00:08:43.991094 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.014041 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.042701 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc 
kubenswrapper[4911]: I1201 00:08:44.062317 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.085723 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.087533 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.087591 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.087615 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.087656 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.087684 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.101864 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.118262 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.141764 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb
9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368
e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.151201 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.151280 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.151389 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.151584 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.151647 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.151728 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.167965 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.187829 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167
871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.195179 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.195240 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.195254 4911 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.195276 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.195295 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.205094 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288bdbc5-c261-4a64-b00b-513367c86b40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb1e7d4fb61774dd975be9022a9fc49669ba3d40607f3b5b14981ce21558f790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.226545 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.246793 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7ee
aee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"2025-12-01T00:07:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f\\\\n2025-12-01T00:07:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f to /host/opt/cni/bin/\\\\n2025-12-01T00:07:43Z [verbose] multus-daemon started\\\\n2025-12-01T00:07:43Z [verbose] Readiness Indicator file check\\\\n2025-12-01T00:08:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-li
b-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.280069 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 
00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.299155 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.299235 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.299253 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.299285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.299311 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.301373 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.320824 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.343505 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.360036 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.360093 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.360103 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc 
kubenswrapper[4911]: I1201 00:08:44.360124 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.360143 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.361663 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.374833 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.376785 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.380253 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.380529 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.380585 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.380657 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.380701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.380713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.380737 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.380752 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.380776 4911 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.380856 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:48.38083377 +0000 UTC m=+148.519530571 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.380965 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.381014 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.381034 4911 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.380979 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:48.380950644 +0000 UTC m=+148.519647425 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.381113 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:48.381098338 +0000 UTC m=+148.519795139 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.398607 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.404059 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.404120 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.404138 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.404169 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.404187 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.414727 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.425034 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.430534 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.430585 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.430597 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.430618 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.430634 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.447543 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.453353 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.453447 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.453501 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.453531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.453551 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.471235 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:44Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.471622 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.474416 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.474566 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.474594 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.474619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.474641 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.482423 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.482638 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.482699 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.482746 4911 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.482747 4911 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.482774 4911 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.482858 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:48.482829389 +0000 UTC m=+148.621526190 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 00:08:44 crc kubenswrapper[4911]: E1201 00:08:44.482893 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:48.482872091 +0000 UTC m=+148.621568902 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.578403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.578526 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.578544 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.578577 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.578599 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.681432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.681561 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.681587 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.681626 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.681648 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.785145 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.785221 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.785246 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.785281 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.785306 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.888935 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.889017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.889039 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.889070 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.889097 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.992144 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.992210 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.992229 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.992254 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.992274 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:44Z","lastTransitionTime":"2025-12-01T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.996349 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/3.log" Dec 01 00:08:44 crc kubenswrapper[4911]: I1201 00:08:44.997509 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/2.log" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.001593 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" exitCode=1 Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.001663 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.001733 4911 scope.go:117] "RemoveContainer" containerID="1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.002898 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:08:45 crc kubenswrapper[4911]: E1201 00:08:45.003400 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.019763 4911 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.044405 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.069416 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"2025-12-01T00:07:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f\\\\n2025-12-01T00:07:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f to /host/opt/cni/bin/\\\\n2025-12-01T00:07:43Z [verbose] multus-daemon started\\\\n2025-12-01T00:07:43Z [verbose] Readiness Indicator file check\\\\n2025-12-01T00:08:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.095443 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.095542 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.095582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.095609 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.095631 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.108335 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1095e648538bebd05b94e5c8dfa90fa40e0acaf2247a9146ac10d181e71d48e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:12Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 00:08:12.009882 6542 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 00:08:12.009911 6542 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 00:08:12.009920 6542 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 00:08:12.009944 6542 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 00:08:12.009957 6542 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 00:08:12.009965 6542 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 00:08:12.010018 6542 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 00:08:12.010045 6542 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 00:08:12.010067 6542 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 00:08:12.010074 6542 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 00:08:12.010100 6542 factory.go:656] Stopping watch factory\\\\nI1201 00:08:12.010114 6542 ovnkube.go:599] Stopped ovnkube\\\\nI1201 00:08:12.010140 6542 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 00:08:12.010150 6542 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 00:08:12.010157 6542 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"ttempt(s)\\\\nI1201 00:08:43.896451 6979 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 00:08:43.896492 6979 default_network_controller.go:776] Recording success event on pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1201 00:08:43.896494 6979 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ptrhz\\\\nF1201 00:08:43.896495 6979 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\
",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.129129 4911 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.151246 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:45 crc kubenswrapper[4911]: E1201 00:08:45.151546 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.165877 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-0
1T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.185919 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.199188 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.199226 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.199239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.199258 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.199272 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.205369 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.226422 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.246409 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.266981 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.281948 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.302467 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.302560 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.302578 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.302606 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.302625 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.307574 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.330418 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.350989 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc 
kubenswrapper[4911]: I1201 00:08:45.368306 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288bdbc5-c261-4a64-b00b-513367c86b40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb1e7d4fb61774dd975be9022a9fc49669ba3d40607f3b5b14981ce21558f790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.390428 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.406020 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.406112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.406131 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.406156 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.406175 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.413280 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.429131 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:45Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.509793 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.510005 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.510025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.510051 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.510069 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.612927 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.612979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.612994 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.613017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.613034 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.716586 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.716657 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.716676 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.716706 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.716726 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.819689 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.819747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.819768 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.819793 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.819809 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.924238 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.924323 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.924350 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.924384 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:45 crc kubenswrapper[4911]: I1201 00:08:45.924410 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:45Z","lastTransitionTime":"2025-12-01T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.009769 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/3.log" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.018367 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:08:46 crc kubenswrapper[4911]: E1201 00:08:46.019403 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.027800 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.027856 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.027923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.027947 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.027964 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.048731 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"ttempt(s)\\\\nI1201 00:08:43.896451 6979 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false 
hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 00:08:43.896492 6979 default_network_controller.go:776] Recording success event on pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1201 00:08:43.896494 6979 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ptrhz\\\\nF1201 00:08:43.896495 6979 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.067830 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.088356 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.110282 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"2025-12-01T00:07:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f\\\\n2025-12-01T00:07:42+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f to /host/opt/cni/bin/\\\\n2025-12-01T00:07:43Z [verbose] multus-daemon started\\\\n2025-12-01T00:07:43Z [verbose] Readiness Indicator file check\\\\n2025-12-01T00:08:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.131008 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.131134 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.131163 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.131188 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.131207 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.131586 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.149347 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.151788 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.151851 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.151937 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:46 crc kubenswrapper[4911]: E1201 00:08:46.152141 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:46 crc kubenswrapper[4911]: E1201 00:08:46.152320 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:46 crc kubenswrapper[4911]: E1201 00:08:46.152532 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.189365 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8
b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.211166 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.231220 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.234538 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.234594 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.234612 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc 
kubenswrapper[4911]: I1201 00:08:46.234642 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.234666 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.251854 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.271536 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.286996 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc 
kubenswrapper[4911]: I1201 00:08:46.303299 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.324253 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22
047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.337448 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.337552 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.337573 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.337599 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.337618 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.345359 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.368262 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.385887 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"288bdbc5-c261-4a64-b00b-513367c86b40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb1e7d4fb61774dd975be9022a9fc49669ba3d40607f3b5b14981ce21558f790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.407822 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af3
4fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.430290 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:46Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.439784 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.439854 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.439898 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.439921 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.439939 4911 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.543285 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.543355 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.543368 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.543401 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.543421 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.646713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.646756 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.646768 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.646788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.646805 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.749967 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.750085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.750111 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.750143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.750167 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.854021 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.854096 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.854129 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.854162 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.854187 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.957965 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.958041 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.958061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.958091 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:46 crc kubenswrapper[4911]: I1201 00:08:46.958117 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:46Z","lastTransitionTime":"2025-12-01T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.060945 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.061052 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.061071 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.061092 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.061105 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.150941 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:47 crc kubenswrapper[4911]: E1201 00:08:47.151198 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.165263 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.165320 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.165340 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.165365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.165385 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.268443 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.268544 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.268562 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.268588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.268608 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.372150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.372228 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.372250 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.372284 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.372306 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.475915 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.475994 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.476016 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.476050 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.476074 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.579382 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.579923 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.580133 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.580316 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.580567 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.683711 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.683770 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.683817 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.683843 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.683860 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.787014 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.787081 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.787093 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.787116 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.787130 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.890681 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.890780 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.890805 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.890849 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.890879 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.994780 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.994875 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.994898 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.994933 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:47 crc kubenswrapper[4911]: I1201 00:08:47.994962 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:47Z","lastTransitionTime":"2025-12-01T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.099218 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.099281 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.099291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.099321 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.099335 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.151597 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.151716 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:48 crc kubenswrapper[4911]: E1201 00:08:48.152023 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.152149 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:48 crc kubenswrapper[4911]: E1201 00:08:48.152527 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:48 crc kubenswrapper[4911]: E1201 00:08:48.152635 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.202444 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.202572 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.202596 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.202630 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.202653 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.306414 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.306516 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.306541 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.306569 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.306590 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.412378 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.412453 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.412503 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.412529 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.412546 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.516038 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.516110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.516132 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.516158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.516176 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.618998 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.619074 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.619091 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.619116 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.619135 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.722721 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.722788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.722803 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.722830 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.722849 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.826006 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.826066 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.826085 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.826113 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.826133 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.929991 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.930054 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.930112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.930145 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:48 crc kubenswrapper[4911]: I1201 00:08:48.930166 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:48Z","lastTransitionTime":"2025-12-01T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.033166 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.033219 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.033237 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.033260 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.033278 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.136928 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.137002 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.137025 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.137055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.137077 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.151674 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:49 crc kubenswrapper[4911]: E1201 00:08:49.151949 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.240922 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.240988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.241008 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.241034 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.241052 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.344493 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.344560 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.344582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.344607 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.344623 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.447835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.447890 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.447909 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.447935 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.447953 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.551507 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.551571 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.551588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.551614 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.551635 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.655664 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.655736 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.655755 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.655781 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.655806 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.760364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.760443 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.760497 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.760533 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.760557 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.864624 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.864686 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.864704 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.864730 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.864748 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.967842 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.967906 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.967924 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.967948 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:49 crc kubenswrapper[4911]: I1201 00:08:49.967966 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:49Z","lastTransitionTime":"2025-12-01T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.071292 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.071364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.071389 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.071418 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.071442 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.151262 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.151401 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.151401 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:50 crc kubenswrapper[4911]: E1201 00:08:50.151614 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:50 crc kubenswrapper[4911]: E1201 00:08:50.151779 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:50 crc kubenswrapper[4911]: E1201 00:08:50.152406 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.173727 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426ef8a35c5350bc7a6f2d5a5de453a9a37037acd030c77338513d0b6c5435\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.175742 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.175816 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.175859 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.175889 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.175912 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.193801 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h54fr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:28Z\\\",\\\"message\\\":\\\"2025-12-01T00:07:42+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f\\\\n2025-12-01T00:07:42+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_491ab7cb-afe7-4828-8d11-ab056a64cf4f to /host/opt/cni/bin/\\\\n2025-12-01T00:07:43Z [verbose] multus-daemon started\\\\n2025-12-01T00:07:43Z [verbose] Readiness Indicator file check\\\\n2025-12-01T00:08:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5758q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h54fr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.218689 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T00:08:44Z\\\",\\\"message\\\":\\\"ttempt(s)\\\\nI1201 00:08:43.896451 6979 model_client.go:382] Update operations generated 
as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 00:08:43.896492 6979 default_network_controller.go:776] Recording success event on pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI1201 00:08:43.896494 6979 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ptrhz\\\\nF1201 00:08:43.896495 6979 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:08:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5ab9fbe12cf446576
a432799f36b39c3f64e635dce50e0d612081618e14c155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trgxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ptrhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.238107 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7186ce3-3813-4ee7-9746-fb06e2f997e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d436985f9669f6f24721618882ef65fca25b72b663521e0cc255c74830ee15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd759f5a51c587d8e98d02f962a97b57fa88e61841e5d0bda649221e944464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe26a7edae12ebb11b2b66112f651176257375db436debc57a6c2de6b5ba0033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://910a6d583a2d968a0e860acc8de97011f001a01b2618984ed24a757875204205\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.255542 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38822279-c87b-4f66-986f-74be25568b61\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86518824f8ac38abb9c1a3328b004b45b5f14356bc442ddd7c53f43c29f63c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2afef566d35c62338dd078176dfce9d54ac5475e65bd3b078382bd1c239dbaf7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e97d29d030a3604c9232acbeb77b8b6fa4bc696d5e5d3d47ee70318f461dc89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.276366 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.277895 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.277947 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.277963 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc 
kubenswrapper[4911]: I1201 00:08:50.277989 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.278008 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.291441 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.305220 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8ml8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68489275-7ca7-441e-9591-bf6993da0b1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9729d70079d67b13788f2db7b6cce5ff18fe9a8ac14fd11e25c67d47f1ba9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j8x9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8ml8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.339646 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f0c714-b255-41c1-bc7c-c43101de446d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5d475a9636c9132209d75ad9d29d5b545ea9017f142155ef6fbb424d33a1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5039a2c63b938038ee594ed0f5ad2a8ba3d6ae742c722cb152a150011b7ffe73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf6e1b1ca0113ca18f167e58f1ee0b5fb04900ce9aca5e6f56a929d71d02345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00
:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554a818665e091d2850f32f9dad6639ba095e9d12d341fbbfc6179a578dc05a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bfd88cae6fd6bacb29017b3fcef5bc7832526dfe0a3d960c8d27d76517a0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42feecdaf3082f32ae8c7da675b08b36d86329553b4a02a36a7441a633779e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6973fcf086af295b1e8d0f5925d789d4b949e00a221159f9a75d6da2a64f334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6081acd07ce90915f9060367a70922df0480eaf10271efe4245fd8c82d6a7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.358424 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.373167 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10941e4a-3eac-4ef3-a814-c83adcea347e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w9mz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bzs4g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.380904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.380950 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.380961 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.380980 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.380996 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.391571 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a597e3cbbe416cb34bcf0b7f6b8178883ee9d5808f4d17bae026849d7b5ae3bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.410414 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca553967-361c-45e2-9f78-15e5bedc7ea6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b81413ee49
684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T00:07:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 00:07:33.822200 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 00:07:33.823407 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1504054846/tls.crt::/tmp/serving-cert-1504054846/tls.key\\\\\\\"\\\\nI1201 00:07:39.480067 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 00:07:39.483794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 00:07:39.483831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 00:07:39.483887 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 00:07:39.483902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 00:07:39.492982 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 00:07:39.493027 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493040 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 00:07:39.493051 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 00:07:39.493057 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 00:07:39.493063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 00:07:39.493069 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 00:07:39.493315 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 00:07:39.498390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.430183 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc12dab7dd4def86e939566331c4700f6354a2182b08fde6509c555330835481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70d2e0129ff2b819d9ffca391246de80738d4f92b2aef1129ccd701cee13f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.446974 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7lz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c8dbb7c-c86c-4fd7-8dbe-5ef321480b40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1ff5bec572c1c74a373a720dd8b9946592ab424759ca923f7cff10fdb49270f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8qqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7lz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.472639 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7e63b3d-a855-4971-8a5a-995fad727bb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846ccac58c9eb0a5f9690d5ef20ae9bc9446b1bb6bc068745531d604e0e0e539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50778c411d85e0b92e60a15ae746b347807cb2e52657bbdc9538ddd30f86ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223fc131c8e5b6b73b0b36ced1e89c62df1faeaecea78fe4e226c05e82ed74fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6630fcf3c375893a99cc55d9aaa3243dfc30214991805b61c300308479de3a28\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36638d71073d11fde881d3f0ab3138cb8d6ab1e3bc693b9669af83ea1ea05736\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e
61fdc41bc7935efa3c455bc847221298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4284cadcac3ff745368e85dc964cc92e61fdc41bc7935efa3c455bc847221298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac27e1e95be22047f3bb4ef38e46b07ab7f7a5f52fb76ddda9686498788a3092\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-01T00:07:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn52n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hmfxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.483975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.484029 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.484046 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.484071 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.484088 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.490831 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470f170b-eeab-4f43-bd48-18e50771289a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a008c6f8175a48092e567f4cee841e07bdcb579f9d2b754315e050184642d1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\"
,\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jx4bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp4w9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.507931 4911 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2659d33d-d79b-4e62-845f-f1538638f390\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ef8d2fb8615c44d6da341e40f4933f7f80ac33a7d5ff7fa2a94a55f9f9cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd834a1c0c5478cb6bffa47cfb7b0b009167871dd75085ed6c69e21c92ce9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7xcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2rl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.523859 4911 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"288bdbc5-c261-4a64-b00b-513367c86b40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb1e7d4fb61774dd975be9022a9fc49669ba3d40607f3b5b14981ce21558f790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627fc3913815ccd167cc7840e9a5eacf041f1dd09886938c881686fd39e0f377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T00:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T00:07:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:50Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.587152 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.587203 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.587216 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.587235 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.587247 4911 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.690368 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.690429 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.690446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.690502 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.690523 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.793069 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.793153 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.793176 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.793204 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.793225 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.896513 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.896573 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.896595 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.896622 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.896640 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.999277 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.999355 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.999378 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.999409 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:50 crc kubenswrapper[4911]: I1201 00:08:50.999430 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:50Z","lastTransitionTime":"2025-12-01T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.102720 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.102824 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.103257 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.104073 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.104122 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.151127 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:51 crc kubenswrapper[4911]: E1201 00:08:51.151337 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.206886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.206937 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.206956 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.206978 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.206997 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.309342 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.309403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.309426 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.309498 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.309517 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.412600 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.412669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.412686 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.412713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.412733 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.516174 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.516237 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.516253 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.516276 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.516294 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.618999 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.619032 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.619041 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.619057 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.619067 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.722545 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.722628 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.722646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.722672 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.722690 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.826330 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.826389 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.826410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.826435 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.826452 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.929780 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.929848 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.929873 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.929904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:51 crc kubenswrapper[4911]: I1201 00:08:51.929928 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:51Z","lastTransitionTime":"2025-12-01T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.033264 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.033672 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.033849 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.034055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.034190 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.137634 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.137705 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.137723 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.137747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.137764 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.150968 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.151024 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:52 crc kubenswrapper[4911]: E1201 00:08:52.151315 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:52 crc kubenswrapper[4911]: E1201 00:08:52.151901 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.152227 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:52 crc kubenswrapper[4911]: E1201 00:08:52.152406 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.240835 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.240891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.240908 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.240933 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.240949 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.344323 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.344382 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.344402 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.344428 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.344446 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.447226 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.447269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.447281 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.447298 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.447309 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.550636 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.550697 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.550715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.550740 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.550760 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.654287 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.654334 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.654346 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.654362 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.654375 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.757669 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.757829 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.757845 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.757868 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.757881 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.861405 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.861505 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.861525 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.861553 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.861572 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.965102 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.965174 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.965192 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.965218 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:52 crc kubenswrapper[4911]: I1201 00:08:52.965237 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:52Z","lastTransitionTime":"2025-12-01T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.068101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.068160 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.068177 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.068206 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.068224 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.151021 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:53 crc kubenswrapper[4911]: E1201 00:08:53.151212 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.170902 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.170952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.170961 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.170973 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.170983 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.274225 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.274291 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.274313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.274341 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.274364 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.378148 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.378238 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.378278 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.378313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.378339 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.482061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.482133 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.482155 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.482185 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.482213 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.584868 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.584925 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.584944 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.584968 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.584983 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.688305 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.688343 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.688352 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.688365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.688397 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.792451 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.792576 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.792604 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.792641 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.792679 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.895871 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.895951 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.895976 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.896007 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.896028 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.998588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.998636 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.998645 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.998719 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:53 crc kubenswrapper[4911]: I1201 00:08:53.998733 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:53Z","lastTransitionTime":"2025-12-01T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.101807 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.101859 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.101876 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.101898 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.101919 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.151188 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.151227 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.151230 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.152059 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.152112 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.152092 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.204357 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.204440 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.204506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.204540 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.204563 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.307663 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.307715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.307732 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.307758 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.307791 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.410874 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.410949 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.410972 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.411001 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.411024 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.514904 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.515203 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.515365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.515569 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.515721 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.618687 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.619033 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.619173 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.619312 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.619439 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.722889 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.722932 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.722943 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.722957 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.722967 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.809493 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.809979 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.810150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.810350 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.810549 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.830355 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:54Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.835761 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.835830 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.835855 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.835886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.835911 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.865495 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:54Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.872201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.872288 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.872314 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.872349 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.872379 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.893198 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:54Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.898982 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.899030 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.899046 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.899069 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.899088 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.919598 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:54Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.925346 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.925415 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.925436 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.925495 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.925515 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.949003 4911 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T00:08:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b4d95f07-110d-43d3-9dda-782c8849ca6a\\\",\\\"systemUUID\\\":\\\"fe489437-a045-4085-a506-8b5514dd1af7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T00:08:54Z is after 2025-08-24T17:21:41Z" Dec 01 00:08:54 crc kubenswrapper[4911]: E1201 00:08:54.949629 4911 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.951936 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.952061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.952150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.952254 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:54 crc kubenswrapper[4911]: I1201 00:08:54.952346 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:54Z","lastTransitionTime":"2025-12-01T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.054941 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.055010 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.055030 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.055055 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.055073 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.151281 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:55 crc kubenswrapper[4911]: E1201 00:08:55.151521 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.158654 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.158718 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.158735 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.158760 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.158778 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.262827 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.262875 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.262887 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.262906 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.262921 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.365358 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.365424 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.365441 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.365507 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.365527 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.468918 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.468986 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.469004 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.469028 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.469045 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.571886 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.571948 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.571971 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.571997 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.572014 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.675929 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.676003 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.676027 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.676058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.676079 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.779674 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.779746 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.779764 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.779789 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.779809 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.882399 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.882455 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.882505 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.882528 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.882545 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.986046 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.986119 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.986136 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.986158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:55 crc kubenswrapper[4911]: I1201 00:08:55.986175 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:55Z","lastTransitionTime":"2025-12-01T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.089884 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.090413 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.090433 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.090488 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.090511 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.151741 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.151812 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.151847 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:56 crc kubenswrapper[4911]: E1201 00:08:56.151967 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:56 crc kubenswrapper[4911]: E1201 00:08:56.152063 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:56 crc kubenswrapper[4911]: E1201 00:08:56.152153 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.193539 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.193655 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.193679 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.193708 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.193729 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.297308 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.297345 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.297357 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.297374 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.297386 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.400224 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.400308 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.400332 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.400364 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.400384 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.503397 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.503497 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.503548 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.503575 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.503593 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.607145 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.607213 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.607235 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.607266 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.607286 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.710268 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.710334 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.710354 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.710381 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.710398 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.813101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.813171 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.813183 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.813205 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.813218 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.916901 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.916982 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.917006 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.917042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:56 crc kubenswrapper[4911]: I1201 00:08:56.917065 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:56Z","lastTransitionTime":"2025-12-01T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.020652 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.020712 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.020731 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.020755 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.020776 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.124208 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.124337 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.124363 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.124399 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.124424 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.151737 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:57 crc kubenswrapper[4911]: E1201 00:08:57.151966 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.228313 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.228373 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.228389 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.228417 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.228436 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.331582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.331640 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.331658 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.331683 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.331707 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.434434 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.434579 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.434600 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.434625 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.434643 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.537420 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.537538 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.537559 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.537588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.537607 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.640441 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.640536 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.640555 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.640580 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.640599 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.744305 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.744385 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.744403 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.744431 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.744450 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.848043 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.848247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.848271 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.848306 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.848330 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.951965 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.952442 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.952636 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.952699 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:57 crc kubenswrapper[4911]: I1201 00:08:57.952723 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:57Z","lastTransitionTime":"2025-12-01T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.056304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.056369 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.056383 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.056410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.056431 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.151246 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.151280 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:08:58 crc kubenswrapper[4911]: E1201 00:08:58.151410 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:08:58 crc kubenswrapper[4911]: E1201 00:08:58.151546 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.152086 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:08:58 crc kubenswrapper[4911]: E1201 00:08:58.152922 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.159052 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.159118 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.159143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.159174 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.159198 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.261953 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.261988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.262000 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.262017 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.262030 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.364861 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.364930 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.364953 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.364983 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.365004 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.468605 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.468677 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.468701 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.468732 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.468757 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.571734 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.571778 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.571788 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.571805 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.571816 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.675341 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.675410 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.675428 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.675488 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.675510 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.778102 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.778165 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.778185 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.778234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.778252 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.881233 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.881304 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.881324 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.881350 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.881369 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.984147 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.984191 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.984201 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.984217 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:58 crc kubenswrapper[4911]: I1201 00:08:58.984228 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:58Z","lastTransitionTime":"2025-12-01T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.086713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.086768 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.086786 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.086813 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.086830 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.151248 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:08:59 crc kubenswrapper[4911]: E1201 00:08:59.151495 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.189207 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.189269 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.189294 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.189324 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.189350 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.292664 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.292713 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.292730 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.292753 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.292772 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.396648 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.396714 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.396725 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.396747 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.396765 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.500520 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.500583 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.500601 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.500628 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.500647 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.604107 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.604168 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.604191 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.604221 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.604243 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.707067 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.707143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.707158 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.707187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.707203 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.811136 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.811259 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.811277 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.811301 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.811321 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.913815 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.913856 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.913871 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.913891 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:08:59 crc kubenswrapper[4911]: I1201 00:08:59.913904 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:08:59Z","lastTransitionTime":"2025-12-01T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.016667 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.016741 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.016759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.016787 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.016807 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.120009 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.120061 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.120079 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.120104 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.120121 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.151742 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.151770 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.151807 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:00 crc kubenswrapper[4911]: E1201 00:09:00.151942 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:00 crc kubenswrapper[4911]: E1201 00:09:00.152653 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:00 crc kubenswrapper[4911]: E1201 00:09:00.152794 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.153192 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:09:00 crc kubenswrapper[4911]: E1201 00:09:00.153539 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.202709 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2rl2" podStartSLOduration=78.202634943 podStartE2EDuration="1m18.202634943s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.181314883 +0000 UTC m=+100.320011694" watchObservedRunningTime="2025-12-01 00:09:00.202634943 +0000 UTC m=+100.341331754" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.224150 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.224239 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.224258 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.224281 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.224298 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.231417 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=17.231382108 podStartE2EDuration="17.231382108s" podCreationTimestamp="2025-12-01 00:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.202935812 +0000 UTC m=+100.341632613" watchObservedRunningTime="2025-12-01 00:09:00.231382108 +0000 UTC m=+100.370078929" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.231726 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.231713599 podStartE2EDuration="1m21.231713599s" podCreationTimestamp="2025-12-01 00:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.230341795 +0000 UTC m=+100.369038596" watchObservedRunningTime="2025-12-01 00:09:00.231713599 +0000 UTC m=+100.370410410" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.269350 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pt7lz" podStartSLOduration=79.269323437 podStartE2EDuration="1m19.269323437s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.269029458 +0000 UTC m=+100.407726249" watchObservedRunningTime="2025-12-01 00:09:00.269323437 +0000 UTC m=+100.408020238" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.300893 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hmfxk" podStartSLOduration=79.300862512 podStartE2EDuration="1m19.300862512s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.299336233 +0000 UTC m=+100.438033034" watchObservedRunningTime="2025-12-01 00:09:00.300862512 +0000 UTC m=+100.439559323" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.326863 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.326917 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.326935 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.326963 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.326982 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.339981 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podStartSLOduration=79.339957748 podStartE2EDuration="1m19.339957748s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.322122659 +0000 UTC m=+100.460819450" watchObservedRunningTime="2025-12-01 00:09:00.339957748 +0000 UTC m=+100.478654529" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.340753 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.340746523 podStartE2EDuration="51.340746523s" podCreationTimestamp="2025-12-01 00:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.339345738 +0000 UTC m=+100.478042559" watchObservedRunningTime="2025-12-01 00:09:00.340746523 +0000 UTC m=+100.479443304" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.428593 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h54fr" podStartSLOduration=79.428562291 podStartE2EDuration="1m19.428562291s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.391169219 +0000 UTC m=+100.529866080" watchObservedRunningTime="2025-12-01 00:09:00.428562291 +0000 UTC m=+100.567259062" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.433946 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: 
I1201 00:09:00.433988 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.434001 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.434042 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.434055 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.474832 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=81.474802684 podStartE2EDuration="1m21.474802684s" podCreationTimestamp="2025-12-01 00:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.473892995 +0000 UTC m=+100.612589796" watchObservedRunningTime="2025-12-01 00:09:00.474802684 +0000 UTC m=+100.613499485" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.513641 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=77.51360887 podStartE2EDuration="1m17.51360887s" podCreationTimestamp="2025-12-01 00:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
00:09:00.494618255 +0000 UTC m=+100.633315086" watchObservedRunningTime="2025-12-01 00:09:00.51360887 +0000 UTC m=+100.652305641" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.537073 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.537168 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.537199 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.537234 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.537258 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.549113 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8ml8w" podStartSLOduration=79.549060059 podStartE2EDuration="1m19.549060059s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:00.547022355 +0000 UTC m=+100.685719126" watchObservedRunningTime="2025-12-01 00:09:00.549060059 +0000 UTC m=+100.687756870" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.564203 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:00 crc kubenswrapper[4911]: E1201 00:09:00.564440 4911 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:09:00 crc kubenswrapper[4911]: E1201 00:09:00.564565 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs podName:10941e4a-3eac-4ef3-a814-c83adcea347e nodeName:}" failed. No retries permitted until 2025-12-01 00:10:04.564539033 +0000 UTC m=+164.703235844 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs") pod "network-metrics-daemon-bzs4g" (UID: "10941e4a-3eac-4ef3-a814-c83adcea347e") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.640163 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.640223 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.640240 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.640268 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.640288 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.744112 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.744193 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.744218 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.744249 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.744273 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.847388 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.847495 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.847519 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.847545 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.847564 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.950992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.951058 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.951076 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.951101 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:00 crc kubenswrapper[4911]: I1201 00:09:00.951119 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:00Z","lastTransitionTime":"2025-12-01T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.054399 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.054482 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.054500 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.054524 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.054541 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.150838 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:01 crc kubenswrapper[4911]: E1201 00:09:01.151025 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.157541 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.157586 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.157600 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.157619 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.157633 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.261252 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.261299 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.261323 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.261350 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.261373 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.365665 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.365729 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.365769 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.365794 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.365813 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.468952 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.468992 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.469003 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.469026 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.469038 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.572380 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.572432 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.572447 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.572520 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.572537 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.675924 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.675983 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.675994 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.676012 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.676022 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.778518 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.778567 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.778576 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.778594 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.778607 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.881091 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.881151 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.881166 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.881189 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.881205 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.984447 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.984564 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.984588 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.984616 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:01 crc kubenswrapper[4911]: I1201 00:09:01.984635 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:01Z","lastTransitionTime":"2025-12-01T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.087601 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.087663 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.087674 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.087695 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.087709 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.151073 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.151072 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.151234 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:02 crc kubenswrapper[4911]: E1201 00:09:02.151397 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:02 crc kubenswrapper[4911]: E1201 00:09:02.151590 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:02 crc kubenswrapper[4911]: E1201 00:09:02.151724 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.191151 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.191196 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.191209 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.191227 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.191239 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.294858 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.294916 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.294933 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.294956 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.294972 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.397681 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.397745 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.397772 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.397803 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.397827 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.501268 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.501323 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.501340 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.501365 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.501381 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.606110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.606161 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.606177 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.606199 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.606217 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.709398 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.709456 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.709506 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.709531 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.709549 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.813089 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.813171 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.813197 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.813235 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.813256 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.916575 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.916646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.916675 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.916707 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:02 crc kubenswrapper[4911]: I1201 00:09:02.916729 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:02Z","lastTransitionTime":"2025-12-01T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.020119 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.020168 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.020184 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.020209 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.020225 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.123105 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.123223 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.123248 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.123282 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.123306 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.150775 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:03 crc kubenswrapper[4911]: E1201 00:09:03.151108 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.226688 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.226770 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.226793 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.226831 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.226856 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.330301 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.330530 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.330556 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.330582 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.330603 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.433608 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.433694 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.433715 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.433748 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.433775 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.537732 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.537797 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.537821 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.537852 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.537871 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.641309 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.641359 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.641377 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.641401 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.641419 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.745066 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.745157 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.745187 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.745233 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.745256 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.848422 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.848517 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.848536 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.848562 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.848589 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.952646 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.952709 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.952726 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.952752 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:03 crc kubenswrapper[4911]: I1201 00:09:03.952770 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:03Z","lastTransitionTime":"2025-12-01T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.056117 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.056185 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.056209 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.056240 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.056264 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.151781 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.151848 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:04 crc kubenswrapper[4911]: E1201 00:09:04.152005 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.152107 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:04 crc kubenswrapper[4911]: E1201 00:09:04.152305 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:04 crc kubenswrapper[4911]: E1201 00:09:04.152371 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.159536 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.159575 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.159589 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.159605 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.159620 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.262882 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.262975 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.262996 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.263057 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.263076 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.366612 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.366656 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.366668 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.366684 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.366696 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.470386 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.470435 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.470446 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.470494 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.470508 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.573288 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.573376 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.573395 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.573420 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.573438 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.676247 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.676309 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.676326 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.676351 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.676367 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.779664 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.779759 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.779780 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.779814 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.779836 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.883110 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.883172 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.883194 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.883223 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.883242 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.987195 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.987264 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.987283 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.987311 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:04 crc kubenswrapper[4911]: I1201 00:09:04.987329 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:04Z","lastTransitionTime":"2025-12-01T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.091540 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.091616 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.091638 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.091665 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.091685 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:05Z","lastTransitionTime":"2025-12-01T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.151729 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:05 crc kubenswrapper[4911]: E1201 00:09:05.151917 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.195833 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.195907 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.195935 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.195970 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.195995 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:05Z","lastTransitionTime":"2025-12-01T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.198049 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.198123 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.198143 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.198174 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.198232 4911 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T00:09:05Z","lastTransitionTime":"2025-12-01T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.271172 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb"] Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.271956 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.276083 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.276329 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.276451 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.276598 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.324500 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6864e593-021d-478a-b16a-0d592880189c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.324622 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6864e593-021d-478a-b16a-0d592880189c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.324671 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6864e593-021d-478a-b16a-0d592880189c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.324707 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6864e593-021d-478a-b16a-0d592880189c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.324758 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6864e593-021d-478a-b16a-0d592880189c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.457333 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6864e593-021d-478a-b16a-0d592880189c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.457417 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6864e593-021d-478a-b16a-0d592880189c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.457506 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6864e593-021d-478a-b16a-0d592880189c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.457557 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6864e593-021d-478a-b16a-0d592880189c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.457591 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6864e593-021d-478a-b16a-0d592880189c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.457624 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6864e593-021d-478a-b16a-0d592880189c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.457671 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/6864e593-021d-478a-b16a-0d592880189c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.459175 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6864e593-021d-478a-b16a-0d592880189c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.467516 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6864e593-021d-478a-b16a-0d592880189c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.487842 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6864e593-021d-478a-b16a-0d592880189c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h8jtb\" (UID: \"6864e593-021d-478a-b16a-0d592880189c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: I1201 00:09:05.600014 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" Dec 01 00:09:05 crc kubenswrapper[4911]: W1201 00:09:05.622678 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6864e593_021d_478a_b16a_0d592880189c.slice/crio-6f3f6aacd884a3eb96ebea763098d6392ac4b6e8c159f1b3475e2edb64c8f558 WatchSource:0}: Error finding container 6f3f6aacd884a3eb96ebea763098d6392ac4b6e8c159f1b3475e2edb64c8f558: Status 404 returned error can't find the container with id 6f3f6aacd884a3eb96ebea763098d6392ac4b6e8c159f1b3475e2edb64c8f558 Dec 01 00:09:06 crc kubenswrapper[4911]: I1201 00:09:06.114754 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" event={"ID":"6864e593-021d-478a-b16a-0d592880189c","Type":"ContainerStarted","Data":"50bf11d7ab62c1786a0afec7bb9cdaccf116b179e8ed234196d09ec22edc0fc0"} Dec 01 00:09:06 crc kubenswrapper[4911]: I1201 00:09:06.114827 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" event={"ID":"6864e593-021d-478a-b16a-0d592880189c","Type":"ContainerStarted","Data":"6f3f6aacd884a3eb96ebea763098d6392ac4b6e8c159f1b3475e2edb64c8f558"} Dec 01 00:09:06 crc kubenswrapper[4911]: I1201 00:09:06.137118 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h8jtb" podStartSLOduration=85.137088021 podStartE2EDuration="1m25.137088021s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:06.135861192 +0000 UTC m=+106.274558073" watchObservedRunningTime="2025-12-01 00:09:06.137088021 +0000 UTC m=+106.275784832" Dec 01 00:09:06 crc kubenswrapper[4911]: I1201 00:09:06.152706 4911 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:06 crc kubenswrapper[4911]: I1201 00:09:06.152928 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:06 crc kubenswrapper[4911]: E1201 00:09:06.153055 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:06 crc kubenswrapper[4911]: I1201 00:09:06.153752 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:06 crc kubenswrapper[4911]: E1201 00:09:06.154090 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:06 crc kubenswrapper[4911]: E1201 00:09:06.154232 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:07 crc kubenswrapper[4911]: I1201 00:09:07.150768 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:07 crc kubenswrapper[4911]: E1201 00:09:07.151427 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:08 crc kubenswrapper[4911]: I1201 00:09:08.151154 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:08 crc kubenswrapper[4911]: I1201 00:09:08.151243 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:08 crc kubenswrapper[4911]: I1201 00:09:08.152213 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:08 crc kubenswrapper[4911]: E1201 00:09:08.152426 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:08 crc kubenswrapper[4911]: E1201 00:09:08.152583 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:08 crc kubenswrapper[4911]: E1201 00:09:08.152707 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:09 crc kubenswrapper[4911]: I1201 00:09:09.151574 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:09 crc kubenswrapper[4911]: E1201 00:09:09.152480 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:10 crc kubenswrapper[4911]: I1201 00:09:10.151757 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:10 crc kubenswrapper[4911]: I1201 00:09:10.151783 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:10 crc kubenswrapper[4911]: I1201 00:09:10.151852 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:10 crc kubenswrapper[4911]: E1201 00:09:10.153537 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:10 crc kubenswrapper[4911]: E1201 00:09:10.153726 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:10 crc kubenswrapper[4911]: E1201 00:09:10.153850 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:11 crc kubenswrapper[4911]: I1201 00:09:11.151142 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:11 crc kubenswrapper[4911]: E1201 00:09:11.151673 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:12 crc kubenswrapper[4911]: I1201 00:09:12.151514 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:12 crc kubenswrapper[4911]: E1201 00:09:12.151708 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:12 crc kubenswrapper[4911]: I1201 00:09:12.151809 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:12 crc kubenswrapper[4911]: E1201 00:09:12.152081 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:12 crc kubenswrapper[4911]: I1201 00:09:12.152116 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:12 crc kubenswrapper[4911]: E1201 00:09:12.152195 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:13 crc kubenswrapper[4911]: I1201 00:09:13.151800 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:13 crc kubenswrapper[4911]: E1201 00:09:13.152925 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:13 crc kubenswrapper[4911]: I1201 00:09:13.153169 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:09:13 crc kubenswrapper[4911]: E1201 00:09:13.153492 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ptrhz_openshift-ovn-kubernetes(d8af6f05-3ccd-4b80-b144-530b83bfdc62)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" Dec 01 00:09:14 crc kubenswrapper[4911]: I1201 00:09:14.150710 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:14 crc kubenswrapper[4911]: E1201 00:09:14.150932 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:14 crc kubenswrapper[4911]: I1201 00:09:14.151051 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:14 crc kubenswrapper[4911]: E1201 00:09:14.151220 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:14 crc kubenswrapper[4911]: I1201 00:09:14.151372 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:14 crc kubenswrapper[4911]: E1201 00:09:14.151727 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:15 crc kubenswrapper[4911]: I1201 00:09:15.151030 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:15 crc kubenswrapper[4911]: E1201 00:09:15.151189 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:15 crc kubenswrapper[4911]: I1201 00:09:15.155418 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/1.log" Dec 01 00:09:15 crc kubenswrapper[4911]: I1201 00:09:15.156077 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/0.log" Dec 01 00:09:15 crc kubenswrapper[4911]: I1201 00:09:15.156165 4911 generic.go:334] "Generic (PLEG): container finished" podID="0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f" containerID="44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77" exitCode=1 Dec 01 00:09:15 crc kubenswrapper[4911]: I1201 00:09:15.156249 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h54fr" event={"ID":"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f","Type":"ContainerDied","Data":"44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77"} Dec 01 00:09:15 crc kubenswrapper[4911]: I1201 00:09:15.156374 4911 scope.go:117] "RemoveContainer" containerID="500db242953960fd18ac4a256812782130c981d733a772db5a12fbaa19ca44ca" Dec 01 00:09:15 crc kubenswrapper[4911]: I1201 00:09:15.156953 4911 scope.go:117] "RemoveContainer" containerID="44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77" Dec 01 00:09:15 crc kubenswrapper[4911]: E1201 00:09:15.157188 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-h54fr_openshift-multus(0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f)\"" pod="openshift-multus/multus-h54fr" podUID="0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f" Dec 01 00:09:16 crc kubenswrapper[4911]: I1201 00:09:16.151199 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:16 crc kubenswrapper[4911]: I1201 00:09:16.151220 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:16 crc kubenswrapper[4911]: I1201 00:09:16.151421 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:16 crc kubenswrapper[4911]: E1201 00:09:16.151636 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:16 crc kubenswrapper[4911]: E1201 00:09:16.151890 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:16 crc kubenswrapper[4911]: E1201 00:09:16.152191 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:16 crc kubenswrapper[4911]: I1201 00:09:16.162224 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/1.log" Dec 01 00:09:17 crc kubenswrapper[4911]: I1201 00:09:17.151079 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:17 crc kubenswrapper[4911]: E1201 00:09:17.151332 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:18 crc kubenswrapper[4911]: I1201 00:09:18.151436 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:18 crc kubenswrapper[4911]: I1201 00:09:18.151432 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:18 crc kubenswrapper[4911]: I1201 00:09:18.151539 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:18 crc kubenswrapper[4911]: E1201 00:09:18.152144 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:18 crc kubenswrapper[4911]: E1201 00:09:18.152281 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:18 crc kubenswrapper[4911]: E1201 00:09:18.152700 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:19 crc kubenswrapper[4911]: I1201 00:09:19.151618 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:19 crc kubenswrapper[4911]: E1201 00:09:19.151826 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:20 crc kubenswrapper[4911]: E1201 00:09:20.120925 4911 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 00:09:20 crc kubenswrapper[4911]: I1201 00:09:20.151221 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:20 crc kubenswrapper[4911]: I1201 00:09:20.151352 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:20 crc kubenswrapper[4911]: E1201 00:09:20.153643 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:20 crc kubenswrapper[4911]: I1201 00:09:20.153694 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:20 crc kubenswrapper[4911]: E1201 00:09:20.154051 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:20 crc kubenswrapper[4911]: E1201 00:09:20.154183 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:20 crc kubenswrapper[4911]: E1201 00:09:20.269361 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 00:09:21 crc kubenswrapper[4911]: I1201 00:09:21.151014 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:21 crc kubenswrapper[4911]: E1201 00:09:21.151160 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:22 crc kubenswrapper[4911]: I1201 00:09:22.151193 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:22 crc kubenswrapper[4911]: I1201 00:09:22.151359 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:22 crc kubenswrapper[4911]: E1201 00:09:22.151523 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:22 crc kubenswrapper[4911]: I1201 00:09:22.151548 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:22 crc kubenswrapper[4911]: E1201 00:09:22.151754 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:22 crc kubenswrapper[4911]: E1201 00:09:22.151858 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:23 crc kubenswrapper[4911]: I1201 00:09:23.150987 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:23 crc kubenswrapper[4911]: E1201 00:09:23.151159 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:24 crc kubenswrapper[4911]: I1201 00:09:24.151138 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:24 crc kubenswrapper[4911]: I1201 00:09:24.151196 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:24 crc kubenswrapper[4911]: I1201 00:09:24.151278 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:24 crc kubenswrapper[4911]: E1201 00:09:24.152127 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:24 crc kubenswrapper[4911]: E1201 00:09:24.152222 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:24 crc kubenswrapper[4911]: E1201 00:09:24.151890 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:25 crc kubenswrapper[4911]: I1201 00:09:25.151055 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:25 crc kubenswrapper[4911]: E1201 00:09:25.151238 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:25 crc kubenswrapper[4911]: E1201 00:09:25.270515 4911 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 00:09:26 crc kubenswrapper[4911]: I1201 00:09:26.150897 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:26 crc kubenswrapper[4911]: I1201 00:09:26.150984 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:26 crc kubenswrapper[4911]: I1201 00:09:26.150905 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:26 crc kubenswrapper[4911]: E1201 00:09:26.151106 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:26 crc kubenswrapper[4911]: E1201 00:09:26.151805 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:26 crc kubenswrapper[4911]: I1201 00:09:26.152448 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:09:26 crc kubenswrapper[4911]: I1201 00:09:26.152585 4911 scope.go:117] "RemoveContainer" containerID="44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77" Dec 01 00:09:26 crc kubenswrapper[4911]: E1201 00:09:26.153248 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:27 crc kubenswrapper[4911]: I1201 00:09:27.144654 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bzs4g"] Dec 01 00:09:27 crc kubenswrapper[4911]: I1201 00:09:27.146047 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:27 crc kubenswrapper[4911]: E1201 00:09:27.146452 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:27 crc kubenswrapper[4911]: I1201 00:09:27.151309 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:27 crc kubenswrapper[4911]: E1201 00:09:27.151525 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:27 crc kubenswrapper[4911]: I1201 00:09:27.215930 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/1.log" Dec 01 00:09:27 crc kubenswrapper[4911]: I1201 00:09:27.216063 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h54fr" event={"ID":"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f","Type":"ContainerStarted","Data":"1c2934aaa0de303dd152f15c122a228a5a1fc72dc4064704c617c56c3180eea5"} Dec 01 00:09:27 crc kubenswrapper[4911]: I1201 00:09:27.219284 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/3.log" Dec 01 00:09:27 crc kubenswrapper[4911]: I1201 00:09:27.225119 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerStarted","Data":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} Dec 01 00:09:27 crc kubenswrapper[4911]: I1201 00:09:27.225728 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:09:28 crc kubenswrapper[4911]: I1201 00:09:28.151306 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:28 crc kubenswrapper[4911]: I1201 00:09:28.151347 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:28 crc kubenswrapper[4911]: E1201 00:09:28.151920 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:28 crc kubenswrapper[4911]: E1201 00:09:28.152047 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:29 crc kubenswrapper[4911]: I1201 00:09:29.150832 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:29 crc kubenswrapper[4911]: I1201 00:09:29.150864 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:29 crc kubenswrapper[4911]: E1201 00:09:29.151053 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzs4g" podUID="10941e4a-3eac-4ef3-a814-c83adcea347e" Dec 01 00:09:29 crc kubenswrapper[4911]: E1201 00:09:29.151232 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 00:09:30 crc kubenswrapper[4911]: I1201 00:09:30.151363 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:30 crc kubenswrapper[4911]: I1201 00:09:30.151382 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:30 crc kubenswrapper[4911]: E1201 00:09:30.153606 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 00:09:30 crc kubenswrapper[4911]: E1201 00:09:30.153823 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 00:09:31 crc kubenswrapper[4911]: I1201 00:09:31.151269 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:31 crc kubenswrapper[4911]: I1201 00:09:31.151296 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:09:31 crc kubenswrapper[4911]: I1201 00:09:31.155231 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 00:09:31 crc kubenswrapper[4911]: I1201 00:09:31.155563 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 00:09:31 crc kubenswrapper[4911]: I1201 00:09:31.157130 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 00:09:31 crc kubenswrapper[4911]: I1201 00:09:31.158006 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 00:09:32 crc kubenswrapper[4911]: I1201 00:09:32.151046 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:32 crc kubenswrapper[4911]: I1201 00:09:32.151197 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:32 crc kubenswrapper[4911]: I1201 00:09:32.154542 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 00:09:32 crc kubenswrapper[4911]: I1201 00:09:32.154814 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.579679 4911 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.627384 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podStartSLOduration=114.627359465 podStartE2EDuration="1m54.627359465s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:27.290822265 +0000 UTC m=+127.429519036" watchObservedRunningTime="2025-12-01 00:09:35.627359465 +0000 UTC m=+135.766056266" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.630122 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.630928 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.634390 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds2z5"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.634746 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.637956 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.638303 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.638563 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j8fnx"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.638712 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.638982 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.639206 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.639316 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.639736 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.640370 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.640404 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" 
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.640612 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.640866 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.641218 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.641804 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.642271 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.643536 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.644129 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.650207 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-l6g55"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.651671 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.653578 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tj98q"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.654204 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.654964 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.662610 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.663239 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.663343 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.663438 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.663611 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.663886 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664025 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664047 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664097 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664156 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664176 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664230 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 00:09:35 crc kubenswrapper[4911]: 
I1201 00:09:35.664263 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664275 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.663622 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664376 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664383 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664428 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664377 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664507 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664548 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664558 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664628 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664650 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664692 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664649 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.664649 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.665313 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.667271 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.667985 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.669676 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.676819 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6w6x2"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.677703 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.678439 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.678588 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.680422 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.682517 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4j8rl"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.680593 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.680639 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.680698 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 00:09:35 crc 
kubenswrapper[4911]: I1201 00:09:35.680752 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.680768 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.685350 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.680883 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.680890 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.680998 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.681109 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.681145 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.689523 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.690194 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.690508 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zph9g"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.690874 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.691302 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.691366 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.696096 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-788ks"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.696891 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-788ks"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.697605 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.698167 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.698484 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29409120-hbwz5"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.699114 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29409120-hbwz5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.701667 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.702096 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.702584 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.703108 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.709132 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9hgb"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.709600 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.711735 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-client-ca\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.711788 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-serving-cert\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.711831 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.711862 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd16121c-421a-4466-8cf9-75c9c77e461a-serving-cert\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.711892 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgq6n\" (UniqueName: \"kubernetes.io/projected/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-kube-api-access-kgq6n\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.711991 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-config\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.712025 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-client-ca\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.712060 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-config\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.712097 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7gzm\" (UniqueName: \"kubernetes.io/projected/dd16121c-421a-4466-8cf9-75c9c77e461a-kube-api-access-p7gzm\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.722447 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.723248 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.723825 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.724302 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.724587 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.724717 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.724735 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.725028 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.725241 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.725447 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.725580 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.725777 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.725588 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.734074 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.736004 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.736531 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.741040 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.743546 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.743574 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.743784 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.743854 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.743902 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.743941 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.744005 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.744018 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.744100 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.744104 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.744130 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.744233 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.744238 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.744292 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.745050 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.746828 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.747657 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.747737 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.747970 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.748129 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.748314 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.748469 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.748540 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.743791 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.748774 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.748855 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.749096 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.749127 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.749444 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.754799 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.754954 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.755062 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.755126 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.755342 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.755773 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.756157 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.756763 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.757419 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.757762 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9qfhm"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.758092 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.758253 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.759613 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.759925 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ckndj"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.761124 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.761655 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.761126 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ckndj"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.766325 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.766918 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.776347 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m2dh9"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.777020 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.777546 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.777815 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.783569 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-96ztf"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.784149 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-96ztf"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.784682 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.790485 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.791111 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.791557 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.795339 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.795545 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.795969 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5twfs"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.797550 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jfg64"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.798177 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jfg64"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.799109 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.799263 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5twfs"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.801751 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.808311 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.810689 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.812308 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66zhz"]
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.812783 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e03a3bd-26ee-4772-9ff5-d802750daa8b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.812812 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gnrs\" (UniqueName: \"kubernetes.io/projected/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-kube-api-access-8gnrs\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.812841 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-service-ca\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.812861 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dd65335-e065-4572-9e3d-912fe012056b-secret-volume\") pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.812878 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c65fab7f-8379-4eba-85dd-eb90f2f71100-auth-proxy-config\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.812894 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.812896 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgq6n\" (UniqueName: \"kubernetes.io/projected/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-kube-api-access-kgq6n\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813116 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4878bb4-63b2-481b-8055-dc5d69809b39-images\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813136 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813155 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-ca\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813337 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813409 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgbfm\" (UniqueName: \"kubernetes.io/projected/cf127d54-ebc4-4ac9-9019-0f62813f311d-kube-api-access-pgbfm\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813437 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg6vn\" (UniqueName: \"kubernetes.io/projected/afd71e8f-dc8d-4906-9c94-e49ba1738d00-kube-api-access-pg6vn\") pod \"dns-operator-744455d44c-j8fnx\" (UID: \"afd71e8f-dc8d-4906-9c94-e49ba1738d00\") " pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813474 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49qf\" (UniqueName: \"kubernetes.io/projected/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-kube-api-access-j49qf\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813502 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-config\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813667 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c65fab7f-8379-4eba-85dd-eb90f2f71100-machine-approver-tls\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813692 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldvmq\" (UniqueName: \"kubernetes.io/projected/067e922f-4238-434f-b747-e8adb2c478c0-kube-api-access-ldvmq\") pod \"multus-admission-controller-857f4d67dd-6w6x2\" (UID: \"067e922f-4238-434f-b747-e8adb2c478c0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813713 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-config\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813733 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-service-ca-bundle\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813750 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813768 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813792 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7gzm\" (UniqueName: \"kubernetes.io/projected/dd16121c-421a-4466-8cf9-75c9c77e461a-kube-api-access-p7gzm\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813809 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-serving-cert\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813827 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc4sq\" (UniqueName: \"kubernetes.io/projected/91ebd18d-f20f-4d2c-9607-9fedbf105795-kube-api-access-wc4sq\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: \"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813843 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-client\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813862 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-client-ca\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813877 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-serving-cert\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813892 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91ebd18d-f20f-4d2c-9607-9fedbf105795-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: \"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813913 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813928 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/067e922f-4238-434f-b747-e8adb2c478c0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6w6x2\" (UID: \"067e922f-4238-434f-b747-e8adb2c478c0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813944 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813957 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh"
Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.813973 4911 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91ebd18d-f20f-4d2c-9607-9fedbf105795-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: \"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814001 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd16121c-421a-4466-8cf9-75c9c77e461a-serving-cert\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814018 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dd65335-e065-4572-9e3d-912fe012056b-config-volume\") pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814058 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4878bb4-63b2-481b-8055-dc5d69809b39-config\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814073 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e03a3bd-26ee-4772-9ff5-d802750daa8b-metrics-tls\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: 
\"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814087 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbhb2\" (UniqueName: \"kubernetes.io/projected/8e03a3bd-26ee-4772-9ff5-d802750daa8b-kube-api-access-xbhb2\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814117 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgss5\" (UniqueName: \"kubernetes.io/projected/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-kube-api-access-fgss5\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814132 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afd71e8f-dc8d-4906-9c94-e49ba1738d00-metrics-tls\") pod \"dns-operator-744455d44c-j8fnx\" (UID: \"afd71e8f-dc8d-4906-9c94-e49ba1738d00\") " pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814162 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8g9p\" (UniqueName: \"kubernetes.io/projected/c65fab7f-8379-4eba-85dd-eb90f2f71100-kube-api-access-f8g9p\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814177 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-config\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814202 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65fab7f-8379-4eba-85dd-eb90f2f71100-config\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814218 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-config\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814233 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e03a3bd-26ee-4772-9ff5-d802750daa8b-trusted-ca\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814250 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf127d54-ebc4-4ac9-9019-0f62813f311d-serving-cert\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814265 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4878bb4-63b2-481b-8055-dc5d69809b39-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814280 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzzwb\" (UniqueName: \"kubernetes.io/projected/3dd65335-e065-4572-9e3d-912fe012056b-kube-api-access-pzzwb\") pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814294 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814309 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814325 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-client-ca\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814340 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvrts\" (UniqueName: \"kubernetes.io/projected/f4878bb4-63b2-481b-8055-dc5d69809b39-kube-api-access-kvrts\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.814356 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-config\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.815477 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-config\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.815890 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-config\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:09:35 
crc kubenswrapper[4911]: I1201 00:09:35.816074 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-client-ca\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.816749 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-client-ca\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.819388 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.819950 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.820822 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.822451 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.823397 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.829516 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.831210 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd16121c-421a-4466-8cf9-75c9c77e461a-serving-cert\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.831278 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds2z5"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.834429 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.835741 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4j8rl"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.836656 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.837324 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-serving-cert\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.837599 4911 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j8fnx"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.839439 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.841552 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29409120-hbwz5"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.852618 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-96zzw"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.853685 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.855130 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-82mw2"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.859082 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.859143 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.860654 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.861158 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6w6x2"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.862664 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.864699 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tj98q"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.867743 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-96ztf"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.868079 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.870217 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.870339 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.871375 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.872871 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-m2dh9"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.874001 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.874206 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.876345 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.877538 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.878527 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.879487 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zph9g"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.880678 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.882662 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.883672 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-96zzw"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.884633 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.885723 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ckndj"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.886742 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-788ks"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.887724 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9qfhm"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.888715 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.889822 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4v62n"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.890643 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.890858 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.891816 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jfg64"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.892783 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.893776 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.895098 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-82mw2"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.896144 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5twfs"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.897214 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9hgb"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.899040 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.899777 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4v62n"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.903051 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.905879 4911 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66zhz"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.907422 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-zj9mr"] Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.908942 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.913374 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.914953 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-ca\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915067 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915154 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgbfm\" (UniqueName: \"kubernetes.io/projected/cf127d54-ebc4-4ac9-9019-0f62813f311d-kube-api-access-pgbfm\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915246 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg6vn\" (UniqueName: \"kubernetes.io/projected/afd71e8f-dc8d-4906-9c94-e49ba1738d00-kube-api-access-pg6vn\") pod \"dns-operator-744455d44c-j8fnx\" (UID: \"afd71e8f-dc8d-4906-9c94-e49ba1738d00\") " pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915323 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49qf\" (UniqueName: \"kubernetes.io/projected/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-kube-api-access-j49qf\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915394 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c65fab7f-8379-4eba-85dd-eb90f2f71100-machine-approver-tls\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915474 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvmq\" (UniqueName: \"kubernetes.io/projected/067e922f-4238-434f-b747-e8adb2c478c0-kube-api-access-ldvmq\") pod \"multus-admission-controller-857f4d67dd-6w6x2\" (UID: \"067e922f-4238-434f-b747-e8adb2c478c0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915545 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-config\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915612 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-service-ca-bundle\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915690 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915759 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915806 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-ca\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 
00:09:35.915883 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-serving-cert\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915564 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.915949 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc4sq\" (UniqueName: \"kubernetes.io/projected/91ebd18d-f20f-4d2c-9607-9fedbf105795-kube-api-access-wc4sq\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: \"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916088 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-client\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916165 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91ebd18d-f20f-4d2c-9607-9fedbf105795-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: 
\"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916228 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-config\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916239 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/067e922f-4238-434f-b747-e8adb2c478c0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6w6x2\" (UID: \"067e922f-4238-434f-b747-e8adb2c478c0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916437 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916526 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916596 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91ebd18d-f20f-4d2c-9607-9fedbf105795-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: \"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916683 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dd65335-e065-4572-9e3d-912fe012056b-config-volume\") pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916763 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4878bb4-63b2-481b-8055-dc5d69809b39-config\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916820 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-service-ca-bundle\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916834 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e03a3bd-26ee-4772-9ff5-d802750daa8b-metrics-tls\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916985 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbhb2\" (UniqueName: \"kubernetes.io/projected/8e03a3bd-26ee-4772-9ff5-d802750daa8b-kube-api-access-xbhb2\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917079 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afd71e8f-dc8d-4906-9c94-e49ba1738d00-metrics-tls\") pod \"dns-operator-744455d44c-j8fnx\" (UID: \"afd71e8f-dc8d-4906-9c94-e49ba1738d00\") " pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917179 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgss5\" (UniqueName: \"kubernetes.io/projected/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-kube-api-access-fgss5\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917247 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-config\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917319 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8g9p\" (UniqueName: \"kubernetes.io/projected/c65fab7f-8379-4eba-85dd-eb90f2f71100-kube-api-access-f8g9p\") pod 
\"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917394 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65fab7f-8379-4eba-85dd-eb90f2f71100-config\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917485 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-config\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917570 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e03a3bd-26ee-4772-9ff5-d802750daa8b-trusted-ca\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917645 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf127d54-ebc4-4ac9-9019-0f62813f311d-serving-cert\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917723 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4878bb4-63b2-481b-8055-dc5d69809b39-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917800 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzzwb\" (UniqueName: \"kubernetes.io/projected/3dd65335-e065-4572-9e3d-912fe012056b-kube-api-access-pzzwb\") pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917878 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917951 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918019 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvrts\" (UniqueName: \"kubernetes.io/projected/f4878bb4-63b2-481b-8055-dc5d69809b39-kube-api-access-kvrts\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918090 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e03a3bd-26ee-4772-9ff5-d802750daa8b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918162 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gnrs\" (UniqueName: \"kubernetes.io/projected/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-kube-api-access-8gnrs\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918240 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65fab7f-8379-4eba-85dd-eb90f2f71100-config\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918243 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-service-ca\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918301 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dd65335-e065-4572-9e3d-912fe012056b-secret-volume\") 
pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918301 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-config\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918324 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c65fab7f-8379-4eba-85dd-eb90f2f71100-auth-proxy-config\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918355 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4878bb4-63b2-481b-8055-dc5d69809b39-images\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918375 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.918719 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c65fab7f-8379-4eba-85dd-eb90f2f71100-machine-approver-tls\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.919108 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-service-ca\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917692 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dd65335-e065-4572-9e3d-912fe012056b-config-volume\") pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.916895 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.919519 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91ebd18d-f20f-4d2c-9607-9fedbf105795-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: \"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" Dec 01 
00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.919764 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e03a3bd-26ee-4772-9ff5-d802750daa8b-trusted-ca\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.920357 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-config\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.920847 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/afd71e8f-dc8d-4906-9c94-e49ba1738d00-metrics-tls\") pod \"dns-operator-744455d44c-j8fnx\" (UID: \"afd71e8f-dc8d-4906-9c94-e49ba1738d00\") " pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.920913 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c65fab7f-8379-4eba-85dd-eb90f2f71100-auth-proxy-config\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917689 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf127d54-ebc4-4ac9-9019-0f62813f311d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917958 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4878bb4-63b2-481b-8055-dc5d69809b39-config\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.917023 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91ebd18d-f20f-4d2c-9607-9fedbf105795-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: \"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.921571 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dd65335-e065-4572-9e3d-912fe012056b-secret-volume\") pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.921760 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4878bb4-63b2-481b-8055-dc5d69809b39-images\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.922005 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.922072 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf127d54-ebc4-4ac9-9019-0f62813f311d-serving-cert\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.922524 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4878bb4-63b2-481b-8055-dc5d69809b39-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.922611 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-etcd-client\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.922962 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.923742 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.924171 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/067e922f-4238-434f-b747-e8adb2c478c0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6w6x2\" (UID: \"067e922f-4238-434f-b747-e8adb2c478c0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.927596 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-serving-cert\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.928184 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e03a3bd-26ee-4772-9ff5-d802750daa8b-metrics-tls\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.936178 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.953422 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 
00:09:35.974002 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 00:09:35 crc kubenswrapper[4911]: I1201 00:09:35.994580 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.016830 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.034662 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.053258 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.074232 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.093589 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.114547 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.133485 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.174918 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.194587 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.214326 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221300 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-trusted-ca\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221330 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-metrics-certs\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221381 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-encryption-config\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221399 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-default-certificate\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc 
kubenswrapper[4911]: I1201 00:09:36.221424 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-tls\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221477 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221517 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-images\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221549 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whrf7\" (UniqueName: \"kubernetes.io/projected/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-kube-api-access-whrf7\") pod \"image-pruner-29409120-hbwz5\" (UID: \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\") " pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221599 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-etcd-client\") 
pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221621 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf4vc\" (UniqueName: \"kubernetes.io/projected/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-kube-api-access-tf4vc\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221643 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-stats-auth\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221678 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lll4l\" (UniqueName: \"kubernetes.io/projected/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-kube-api-access-lll4l\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221711 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-bound-sa-token\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221733 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2hm7\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-kube-api-access-q2hm7\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221754 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3daa9b4f-c005-418c-854c-a81a04ab607a-service-ca-bundle\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221790 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2bca0c5-b712-4648-a9a8-34543b89d5db-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221814 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqr92\" (UniqueName: \"kubernetes.io/projected/269e7d1c-4918-4344-8427-58a6c9c1d8e7-kube-api-access-gqr92\") pod \"cluster-samples-operator-665b6dd947-jrw9t\" (UID: \"269e7d1c-4918-4344-8427-58a6c9c1d8e7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221837 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-trusted-ca-bundle\") pod 
\"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221866 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221896 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c86t\" (UniqueName: \"kubernetes.io/projected/3daa9b4f-c005-418c-854c-a81a04ab607a-kube-api-access-5c86t\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221927 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-certificates\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221956 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tzxf\" (UniqueName: \"kubernetes.io/projected/319f8b92-7275-4984-aaf1-4b04d926062f-kube-api-access-4tzxf\") pod \"migrator-59844c95c7-gsckt\" (UID: \"319f8b92-7275-4984-aaf1-4b04d926062f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221976 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/269e7d1c-4918-4344-8427-58a6c9c1d8e7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jrw9t\" (UID: \"269e7d1c-4918-4344-8427-58a6c9c1d8e7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.221998 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-serviceca\") pod \"image-pruner-29409120-hbwz5\" (UID: \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\") " pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.222016 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.222042 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-audit-dir\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.222061 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-proxy-tls\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: 
\"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.222092 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2bca0c5-b712-4648-a9a8-34543b89d5db-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.222118 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-audit-policies\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.222137 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-serving-cert\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.222162 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:36.722147821 +0000 UTC m=+136.860844592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.233765 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.253672 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.273756 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.302330 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.314316 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.323369 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.323585 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:36.823550322 +0000 UTC m=+136.962247103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.323722 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-audit-dir\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.323779 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgx2t\" (UniqueName: \"kubernetes.io/projected/65c9bf23-09ab-404c-acbd-21fb12f9f441-kube-api-access-mgx2t\") pod \"package-server-manager-789f6589d5-5s5s4\" (UID: \"65c9bf23-09ab-404c-acbd-21fb12f9f441\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.323816 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-oauth-serving-cert\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " 
pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.323857 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-audit-dir\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.323867 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-audit-policies\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.323930 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.323957 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.324001 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/132c1139-f0a6-4ca4-97db-bc89244c6b26-signing-cabundle\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.324027 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.324077 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f47c16a-94f6-48e8-8757-ffea1b773ec8-trusted-ca\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.324149 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.324444 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-audit-policies\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc 
kubenswrapper[4911]: I1201 00:09:36.324829 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11919566-9901-4428-a994-c2af062e6b24-cert\") pod \"ingress-canary-96zzw\" (UID: \"11919566-9901-4428-a994-c2af062e6b24\") " pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.324902 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-socket-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.324991 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.325079 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-metrics-certs\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.325657 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: 
\"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.325710 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e11323f-4d92-45ac-86e3-880c8437bfbb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: \"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.325747 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/197d08f0-6e72-49e1-8e05-bc571808c8d3-config-volume\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.325794 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-audit\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.325868 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-encryption-config\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.325902 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/97b939bc-01a8-41d3-90be-642b0bf45a7b-profile-collector-cert\") pod \"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326049 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd57385-607f-4585-853f-f7ab3b4dd18d-config\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326339 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-tls\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326413 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5fg7\" (UniqueName: \"kubernetes.io/projected/5f81c0bb-de86-40af-8412-ceb41bf9478e-kube-api-access-k5fg7\") pod \"control-plane-machine-set-operator-78cbb6b69f-tv9lz\" (UID: \"5f81c0bb-de86-40af-8412-ceb41bf9478e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326515 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cft4r\" (UniqueName: \"kubernetes.io/projected/132c1139-f0a6-4ca4-97db-bc89244c6b26-kube-api-access-cft4r\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" 
Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326610 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326698 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg7l2\" (UniqueName: \"kubernetes.io/projected/92365c00-de26-4e68-89d2-724cf199e249-kube-api-access-pg7l2\") pod \"downloads-7954f5f757-ckndj\" (UID: \"92365c00-de26-4e68-89d2-724cf199e249\") " pod="openshift-console/downloads-7954f5f757-ckndj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326810 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326858 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-registration-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.326982 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-etcd-client\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.327048 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-stats-auth\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.327576 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.327656 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-node-pullsecrets\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.327690 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4fkn\" (UniqueName: \"kubernetes.io/projected/017917e8-8480-473b-858b-46626ef5f770-kube-api-access-d4fkn\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.327735 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lll4l\" (UniqueName: \"kubernetes.io/projected/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-kube-api-access-lll4l\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.327771 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-console-config\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.327805 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-audit-dir\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328017 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328128 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-bound-sa-token\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328301 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-srv-cert\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328414 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/65c9bf23-09ab-404c-acbd-21fb12f9f441-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5s5s4\" (UID: \"65c9bf23-09ab-404c-acbd-21fb12f9f441\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328484 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f47c16a-94f6-48e8-8757-ffea1b773ec8-serving-cert\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328517 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-etcd-client\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328559 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-serving-cert\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328589 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-trusted-ca-bundle\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328627 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328670 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328704 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt2t2\" (UniqueName: \"kubernetes.io/projected/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-kube-api-access-gt2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:36 
crc kubenswrapper[4911]: I1201 00:09:36.328738 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328769 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-webhook-cert\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328800 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c86t\" (UniqueName: \"kubernetes.io/projected/3daa9b4f-c005-418c-854c-a81a04ab607a-kube-api-access-5c86t\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.328997 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329064 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329170 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdncd\" (UniqueName: \"kubernetes.io/projected/2722c6c0-aedc-4764-8e11-7a7b5b694151-kube-api-access-rdncd\") pod \"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329211 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-certificates\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329376 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329417 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329436 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-etcd-serving-ca\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329582 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tzxf\" (UniqueName: \"kubernetes.io/projected/319f8b92-7275-4984-aaf1-4b04d926062f-kube-api-access-4tzxf\") pod \"migrator-59844c95c7-gsckt\" (UID: \"319f8b92-7275-4984-aaf1-4b04d926062f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329628 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djwn8\" (UniqueName: \"kubernetes.io/projected/85eaed94-1314-4f16-bdf1-a598b183d97c-kube-api-access-djwn8\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329714 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-proxy-tls\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329749 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-metrics-certs\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329798 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2bca0c5-b712-4648-a9a8-34543b89d5db-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329862 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-serving-cert\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329909 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.329957 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-csi-data-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330012 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-service-ca\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330060 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e968b321-636f-418d-b788-445a0b3cc2a2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330099 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8w46\" (UniqueName: \"kubernetes.io/projected/dbd57385-607f-4585-853f-f7ab3b4dd18d-kube-api-access-v8w46\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330137 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-trusted-ca\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330168 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/017917e8-8480-473b-858b-46626ef5f770-console-serving-cert\") pod \"console-f9d7485db-5twfs\" (UID: 
\"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330503 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f47c16a-94f6-48e8-8757-ffea1b773ec8-config\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330550 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330597 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330674 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330712 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-default-certificate\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330745 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8grvj\" (UniqueName: \"kubernetes.io/projected/4e11323f-4d92-45ac-86e3-880c8437bfbb-kube-api-access-8grvj\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: \"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.330781 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd57385-607f-4585-853f-f7ab3b4dd18d-serving-cert\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.331075 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcst9\" (UniqueName: \"kubernetes.io/projected/197d08f0-6e72-49e1-8e05-bc571808c8d3-kube-api-access-rcst9\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.331759 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whrf7\" (UniqueName: \"kubernetes.io/projected/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-kube-api-access-whrf7\") pod \"image-pruner-29409120-hbwz5\" (UID: \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\") " 
pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332084 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-images\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332091 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-certificates\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332136 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf4vc\" (UniqueName: \"kubernetes.io/projected/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-kube-api-access-tf4vc\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332481 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmjqz\" (UniqueName: \"kubernetes.io/projected/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-kube-api-access-hmjqz\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332513 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hh47\" (UniqueName: 
\"kubernetes.io/projected/a4b27e6c-5803-46ae-ac80-00f249cb714c-kube-api-access-9hh47\") pod \"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332541 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-config\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332645 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-trusted-ca\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332795 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb62x\" (UniqueName: \"kubernetes.io/projected/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-kube-api-access-mb62x\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332831 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e968b321-636f-418d-b788-445a0b3cc2a2-config\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332859 
4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2hm7\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-kube-api-access-q2hm7\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332884 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3daa9b4f-c005-418c-854c-a81a04ab607a-service-ca-bundle\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332911 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e11323f-4d92-45ac-86e3-880c8437bfbb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: \"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332935 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-node-bootstrap-token\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.332957 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-tmpfs\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: 
\"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333039 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333067 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/97b939bc-01a8-41d3-90be-642b0bf45a7b-srv-cert\") pod \"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333068 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-images\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333093 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e968b321-636f-418d-b788-445a0b3cc2a2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333118 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2bca0c5-b712-4648-a9a8-34543b89d5db-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333144 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqr92\" (UniqueName: \"kubernetes.io/projected/269e7d1c-4918-4344-8427-58a6c9c1d8e7-kube-api-access-gqr92\") pod \"cluster-samples-operator-665b6dd947-jrw9t\" (UID: \"269e7d1c-4918-4344-8427-58a6c9c1d8e7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333173 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqspr\" (UniqueName: \"kubernetes.io/projected/0cdb2706-8203-45ab-a34a-668f1d0dd300-kube-api-access-gqspr\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333195 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2722c6c0-aedc-4764-8e11-7a7b5b694151-proxy-tls\") pod \"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333220 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2722c6c0-aedc-4764-8e11-7a7b5b694151-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.333761 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2bca0c5-b712-4648-a9a8-34543b89d5db-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334175 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7frgp\" (UniqueName: \"kubernetes.io/projected/b88051a1-6f40-46c1-b01e-78de96d4a909-kube-api-access-7frgp\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334215 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/132c1139-f0a6-4ca4-97db-bc89244c6b26-signing-key\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334244 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/197d08f0-6e72-49e1-8e05-bc571808c8d3-metrics-tls\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334271 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7649r\" 
(UniqueName: \"kubernetes.io/projected/2f47c16a-94f6-48e8-8757-ffea1b773ec8-kube-api-access-7649r\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334297 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-plugins-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334322 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxb68\" (UniqueName: \"kubernetes.io/projected/11919566-9901-4428-a994-c2af062e6b24-kube-api-access-sxb68\") pod \"ingress-canary-96zzw\" (UID: \"11919566-9901-4428-a994-c2af062e6b24\") " pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334359 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334368 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334437 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-mountpoint-dir\") pod 
\"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334358 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-serving-cert\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334532 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-policies\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334566 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-dir\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334638 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334734 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-bhwfm\" (UniqueName: \"kubernetes.io/projected/97b939bc-01a8-41d3-90be-642b0bf45a7b-kube-api-access-bhwfm\") pod \"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334818 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f81c0bb-de86-40af-8412-ceb41bf9478e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tv9lz\" (UID: \"5f81c0bb-de86-40af-8412-ceb41bf9478e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334897 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gxkx\" (UniqueName: \"kubernetes.io/projected/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-kube-api-access-7gxkx\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334955 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.334929 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-encryption-config\") pod \"apiserver-76f77b778f-jfg64\" (UID: 
\"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.335005 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/017917e8-8480-473b-858b-46626ef5f770-console-oauth-config\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.335031 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-certs\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.335069 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/269e7d1c-4918-4344-8427-58a6c9c1d8e7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jrw9t\" (UID: \"269e7d1c-4918-4344-8427-58a6c9c1d8e7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.335104 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-serviceca\") pod \"image-pruner-29409120-hbwz5\" (UID: \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\") " pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.335125 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.335150 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.335174 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-image-import-ca\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.335869 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2bca0c5-b712-4648-a9a8-34543b89d5db-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.336054 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.336345 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-serviceca\") pod \"image-pruner-29409120-hbwz5\" (UID: \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\") " pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.336493 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3daa9b4f-c005-418c-854c-a81a04ab607a-service-ca-bundle\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.336679 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:36.836657937 +0000 UTC m=+136.975354748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.336997 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-etcd-client\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.337911 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-stats-auth\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.338533 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-proxy-tls\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.338720 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-tls\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.339352 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-encryption-config\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.339480 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3daa9b4f-c005-418c-854c-a81a04ab607a-default-certificate\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.339599 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/269e7d1c-4918-4344-8427-58a6c9c1d8e7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jrw9t\" (UID: \"269e7d1c-4918-4344-8427-58a6c9c1d8e7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.354308 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.374015 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.394044 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 
00:09:36.422821 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.433542 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.435891 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.436071 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:36.936049984 +0000 UTC m=+137.074746765 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436115 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e968b321-636f-418d-b788-445a0b3cc2a2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436154 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/97b939bc-01a8-41d3-90be-642b0bf45a7b-srv-cert\") pod \"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436204 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqspr\" (UniqueName: \"kubernetes.io/projected/0cdb2706-8203-45ab-a34a-668f1d0dd300-kube-api-access-gqspr\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436236 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2722c6c0-aedc-4764-8e11-7a7b5b694151-proxy-tls\") pod 
\"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436267 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2722c6c0-aedc-4764-8e11-7a7b5b694151-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436300 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/197d08f0-6e72-49e1-8e05-bc571808c8d3-metrics-tls\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436350 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7frgp\" (UniqueName: \"kubernetes.io/projected/b88051a1-6f40-46c1-b01e-78de96d4a909-kube-api-access-7frgp\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436381 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/132c1139-f0a6-4ca4-97db-bc89244c6b26-signing-key\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436417 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7649r\" (UniqueName: 
\"kubernetes.io/projected/2f47c16a-94f6-48e8-8757-ffea1b773ec8-kube-api-access-7649r\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436447 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-plugins-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436520 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxb68\" (UniqueName: \"kubernetes.io/projected/11919566-9901-4428-a994-c2af062e6b24-kube-api-access-sxb68\") pod \"ingress-canary-96zzw\" (UID: \"11919566-9901-4428-a994-c2af062e6b24\") " pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436592 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-mountpoint-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436631 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-policies\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436660 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-dir\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436694 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436729 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhwfm\" (UniqueName: \"kubernetes.io/projected/97b939bc-01a8-41d3-90be-642b0bf45a7b-kube-api-access-bhwfm\") pod \"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436764 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f81c0bb-de86-40af-8412-ceb41bf9478e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tv9lz\" (UID: \"5f81c0bb-de86-40af-8412-ceb41bf9478e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436803 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gxkx\" (UniqueName: \"kubernetes.io/projected/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-kube-api-access-7gxkx\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436834 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-certs\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436829 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-mountpoint-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436864 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-encryption-config\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436895 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/017917e8-8480-473b-858b-46626ef5f770-console-oauth-config\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436931 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-image-import-ca\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.436966 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437001 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-oauth-serving-cert\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437033 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgx2t\" (UniqueName: \"kubernetes.io/projected/65c9bf23-09ab-404c-acbd-21fb12f9f441-kube-api-access-mgx2t\") pod \"package-server-manager-789f6589d5-5s5s4\" (UID: \"65c9bf23-09ab-404c-acbd-21fb12f9f441\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437075 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437106 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437137 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/132c1139-f0a6-4ca4-97db-bc89244c6b26-signing-cabundle\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437168 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437199 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f47c16a-94f6-48e8-8757-ffea1b773ec8-trusted-ca\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437231 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc 
kubenswrapper[4911]: I1201 00:09:36.437252 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-plugins-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437271 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11919566-9901-4428-a994-c2af062e6b24-cert\") pod \"ingress-canary-96zzw\" (UID: \"11919566-9901-4428-a994-c2af062e6b24\") " pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437303 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-socket-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437351 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2722c6c0-aedc-4764-8e11-7a7b5b694151-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437368 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437424 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437486 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e11323f-4d92-45ac-86e3-880c8437bfbb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: \"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437509 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/197d08f0-6e72-49e1-8e05-bc571808c8d3-config-volume\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437535 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-audit\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437571 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/97b939bc-01a8-41d3-90be-642b0bf45a7b-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437594 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd57385-607f-4585-853f-f7ab3b4dd18d-config\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437622 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5fg7\" (UniqueName: \"kubernetes.io/projected/5f81c0bb-de86-40af-8412-ceb41bf9478e-kube-api-access-k5fg7\") pod \"control-plane-machine-set-operator-78cbb6b69f-tv9lz\" (UID: \"5f81c0bb-de86-40af-8412-ceb41bf9478e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437645 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cft4r\" (UniqueName: \"kubernetes.io/projected/132c1139-f0a6-4ca4-97db-bc89244c6b26-kube-api-access-cft4r\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437670 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437700 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437722 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg7l2\" (UniqueName: \"kubernetes.io/projected/92365c00-de26-4e68-89d2-724cf199e249-kube-api-access-pg7l2\") pod \"downloads-7954f5f757-ckndj\" (UID: \"92365c00-de26-4e68-89d2-724cf199e249\") " pod="openshift-console/downloads-7954f5f757-ckndj" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437756 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-registration-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437795 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437836 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-node-pullsecrets\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 
00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437859 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4fkn\" (UniqueName: \"kubernetes.io/projected/017917e8-8480-473b-858b-46626ef5f770-kube-api-access-d4fkn\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437898 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-console-config\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437923 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-audit-dir\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437976 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438000 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-srv-cert\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438026 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/65c9bf23-09ab-404c-acbd-21fb12f9f441-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5s5s4\" (UID: \"65c9bf23-09ab-404c-acbd-21fb12f9f441\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438058 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f47c16a-94f6-48e8-8757-ffea1b773ec8-serving-cert\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438086 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-etcd-client\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438110 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-serving-cert\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438134 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-trusted-ca-bundle\") pod 
\"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438174 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438204 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt2t2\" (UniqueName: \"kubernetes.io/projected/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-kube-api-access-gt2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438240 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-webhook-cert\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438271 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438317 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438351 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438413 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdncd\" (UniqueName: \"kubernetes.io/projected/2722c6c0-aedc-4764-8e11-7a7b5b694151-kube-api-access-rdncd\") pod \"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.438486 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.437990 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-dir\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.439064 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-node-pullsecrets\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.439381 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-registration-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.439492 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.439580 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-socket-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.439601 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 00:09:36.939571809 +0000 UTC m=+137.078268630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.440534 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-etcd-serving-ca\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.439719 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-audit-dir\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.439944 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.439671 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-policies\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.440708 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djwn8\" (UniqueName: \"kubernetes.io/projected/85eaed94-1314-4f16-bdf1-a598b183d97c-kube-api-access-djwn8\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.440854 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.440917 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-csi-data-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.440952 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e968b321-636f-418d-b788-445a0b3cc2a2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.440986 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-service-ca\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441028 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8w46\" (UniqueName: \"kubernetes.io/projected/dbd57385-607f-4585-853f-f7ab3b4dd18d-kube-api-access-v8w46\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441120 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b88051a1-6f40-46c1-b01e-78de96d4a909-csi-data-dir\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441193 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441735 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/017917e8-8480-473b-858b-46626ef5f770-console-serving-cert\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc 
kubenswrapper[4911]: I1201 00:09:36.441777 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f47c16a-94f6-48e8-8757-ffea1b773ec8-config\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441806 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441838 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441878 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441914 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8grvj\" (UniqueName: \"kubernetes.io/projected/4e11323f-4d92-45ac-86e3-880c8437bfbb-kube-api-access-8grvj\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: 
\"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441952 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd57385-607f-4585-853f-f7ab3b4dd18d-serving-cert\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.441999 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcst9\" (UniqueName: \"kubernetes.io/projected/197d08f0-6e72-49e1-8e05-bc571808c8d3-kube-api-access-rcst9\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.442079 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmjqz\" (UniqueName: \"kubernetes.io/projected/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-kube-api-access-hmjqz\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.442110 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hh47\" (UniqueName: \"kubernetes.io/projected/a4b27e6c-5803-46ae-ac80-00f249cb714c-kube-api-access-9hh47\") pod \"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.442142 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-config\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.442175 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb62x\" (UniqueName: \"kubernetes.io/projected/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-kube-api-access-mb62x\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.442207 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e968b321-636f-418d-b788-445a0b3cc2a2-config\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.442255 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e11323f-4d92-45ac-86e3-880c8437bfbb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: \"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.442283 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-node-bootstrap-token\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 
00:09:36.442390 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-tmpfs\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.442433 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.443300 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.443611 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-tmpfs\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.443968 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e968b321-636f-418d-b788-445a0b3cc2a2-config\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.444026 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/97b939bc-01a8-41d3-90be-642b0bf45a7b-srv-cert\") pod \"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.444988 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.445448 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.445820 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.445960 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.446231 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/97b939bc-01a8-41d3-90be-642b0bf45a7b-profile-collector-cert\") pod \"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.446641 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.446780 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.447355 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 
crc kubenswrapper[4911]: I1201 00:09:36.447506 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.448658 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.448949 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-srv-cert\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.449452 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.449956 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.449988 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e968b321-636f-418d-b788-445a0b3cc2a2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.450260 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-webhook-cert\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.453113 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.461010 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/132c1139-f0a6-4ca4-97db-bc89244c6b26-signing-key\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.472671 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.494374 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.500158 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/132c1139-f0a6-4ca4-97db-bc89244c6b26-signing-cabundle\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.513506 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.534193 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.543198 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.543780 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.043759376 +0000 UTC m=+137.182456147 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.544346 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.544756 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.044734382 +0000 UTC m=+137.183431183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.553599 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.565573 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f47c16a-94f6-48e8-8757-ffea1b773ec8-serving-cert\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.574104 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.583395 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f47c16a-94f6-48e8-8757-ffea1b773ec8-config\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.602788 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.610368 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2f47c16a-94f6-48e8-8757-ffea1b773ec8-trusted-ca\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.614175 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.633318 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.646017 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.646280 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.146244896 +0000 UTC m=+137.284941697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.646664 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.647142 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.14712131 +0000 UTC m=+137.285818121 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.653585 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.663394 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.673716 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.693749 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.701639 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.714546 4911 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.733337 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.741090 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2722c6c0-aedc-4764-8e11-7a7b5b694151-proxy-tls\") pod \"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.748605 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.748895 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.24886066 +0000 UTC m=+137.387557441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.749381 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.749765 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.249749344 +0000 UTC m=+137.388446115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.754912 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.774323 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.793613 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.805114 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/65c9bf23-09ab-404c-acbd-21fb12f9f441-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5s5s4\" (UID: \"65c9bf23-09ab-404c-acbd-21fb12f9f441\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.811692 4911 request.go:700] Waited for 1.012260851s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/configmaps?fieldSelector=metadata.name%3Doauth-serving-cert&limit=500&resourceVersion=0 Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.813574 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.820806 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-oauth-serving-cert\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.834536 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.851063 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.851256 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.351218857 +0000 UTC m=+137.489915668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.851874 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.852610 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.352581564 +0000 UTC m=+137.491278375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.854305 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.863517 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-etcd-client\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.874535 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.887284 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-serving-cert\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.893815 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.901936 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-encryption-config\") pod 
\"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.913456 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.923473 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-config\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.933289 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.939445 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-audit\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.953437 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.953903 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.953950 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.453938824 +0000 UTC m=+137.592635595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.954626 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:36 crc kubenswrapper[4911]: E1201 00:09:36.954978 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.454970322 +0000 UTC m=+137.593667093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.961422 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-etcd-serving-ca\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.973068 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 00:09:36 crc kubenswrapper[4911]: I1201 00:09:36.980568 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-image-import-ca\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.001400 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.003549 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:37 crc 
kubenswrapper[4911]: I1201 00:09:37.014370 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.034448 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.054273 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.055950 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.056509 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.556434064 +0000 UTC m=+137.695130965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.057584 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.058164 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.558144611 +0000 UTC m=+137.696841422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.067766 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/017917e8-8480-473b-858b-46626ef5f770-console-serving-cert\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.074746 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.083957 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/017917e8-8480-473b-858b-46626ef5f770-console-oauth-config\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.095168 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.102766 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-console-config\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:37 crc kubenswrapper[4911]: 
I1201 00:09:37.114736 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.122280 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-service-ca\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.143776 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.150935 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017917e8-8480-473b-858b-46626ef5f770-trusted-ca-bundle\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.160825 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.161540 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.661486764 +0000 UTC m=+137.800183545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.161860 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.163102 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.163716 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.663696444 +0000 UTC m=+137.802393235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.168902 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e11323f-4d92-45ac-86e3-880c8437bfbb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: \"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.174065 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.194302 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.206042 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e11323f-4d92-45ac-86e3-880c8437bfbb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: \"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.214865 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 
01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.234358 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.264194 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.264418 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.764385136 +0000 UTC m=+137.903081947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.265168 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.265805 4911 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.765743532 +0000 UTC m=+137.904440333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.273359 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.275839 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgq6n\" (UniqueName: \"kubernetes.io/projected/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-kube-api-access-kgq6n\") pod \"route-controller-manager-6576b87f9c-6nqdq\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.294339 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.323068 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.331964 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.333162 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.354618 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.364618 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.367239 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.367428 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.86740114 +0000 UTC m=+138.006097951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.367993 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.368562 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.868538821 +0000 UTC m=+138.007235642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.390433 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7gzm\" (UniqueName: \"kubernetes.io/projected/dd16121c-421a-4466-8cf9-75c9c77e461a-kube-api-access-p7gzm\") pod \"controller-manager-879f6c89f-ds2z5\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.413969 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.427789 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd57385-607f-4585-853f-f7ab3b4dd18d-serving-cert\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.434398 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.436670 4911 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.436820 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/197d08f0-6e72-49e1-8e05-bc571808c8d3-metrics-tls podName:197d08f0-6e72-49e1-8e05-bc571808c8d3 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.936786502 +0000 UTC m=+138.075483273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/197d08f0-6e72-49e1-8e05-bc571808c8d3-metrics-tls") pod "dns-default-4v62n" (UID: "197d08f0-6e72-49e1-8e05-bc571808c8d3") : failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439151 4911 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439172 4911 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439184 4911 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439205 4911 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439241 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/197d08f0-6e72-49e1-8e05-bc571808c8d3-config-volume podName:197d08f0-6e72-49e1-8e05-bc571808c8d3 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.939201088 +0000 UTC m=+138.077897859 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/197d08f0-6e72-49e1-8e05-bc571808c8d3-config-volume") pod "dns-default-4v62n" (UID: "197d08f0-6e72-49e1-8e05-bc571808c8d3") : failed to sync configmap cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439504 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-certs podName:0cdb2706-8203-45ab-a34a-668f1d0dd300 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.939433724 +0000 UTC m=+138.078130525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-certs") pod "machine-config-server-zj9mr" (UID: "0cdb2706-8203-45ab-a34a-668f1d0dd300") : failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439550 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dbd57385-607f-4585-853f-f7ab3b4dd18d-config podName:dbd57385-607f-4585-853f-f7ab3b4dd18d nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.939536547 +0000 UTC m=+138.078233358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dbd57385-607f-4585-853f-f7ab3b4dd18d-config") pod "service-ca-operator-777779d784-pjc2h" (UID: "dbd57385-607f-4585-853f-f7ab3b4dd18d") : failed to sync configmap cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439584 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f81c0bb-de86-40af-8412-ceb41bf9478e-control-plane-machine-set-operator-tls podName:5f81c0bb-de86-40af-8412-ceb41bf9478e nodeName:}" failed. 
No retries permitted until 2025-12-01 00:09:37.939569478 +0000 UTC m=+138.078266279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/5f81c0bb-de86-40af-8412-ceb41bf9478e-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-tv9lz" (UID: "5f81c0bb-de86-40af-8412-ceb41bf9478e") : failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439601 4911 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.439751 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11919566-9901-4428-a994-c2af062e6b24-cert podName:11919566-9901-4428-a994-c2af062e6b24 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.939688071 +0000 UTC m=+138.078385032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11919566-9901-4428-a994-c2af062e6b24-cert") pod "ingress-canary-96zzw" (UID: "11919566-9901-4428-a994-c2af062e6b24") : failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.443534 4911 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.443641 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-node-bootstrap-token podName:0cdb2706-8203-45ab-a34a-668f1d0dd300 nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.943618758 +0000 UTC m=+138.082315559 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-node-bootstrap-token") pod "machine-config-server-zj9mr" (UID: "0cdb2706-8203-45ab-a34a-668f1d0dd300") : failed to sync secret cache: timed out waiting for the condition Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.453770 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.455631 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.467830 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.470005 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.470276 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.97022669 +0000 UTC m=+138.108923471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.471124 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.471484 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:37.971443473 +0000 UTC m=+138.110140234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.473437 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.494627 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.514767 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.533573 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.554093 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.572760 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.573410 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.073393288 +0000 UTC m=+138.212090059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.576925 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.594651 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.614018 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.633762 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.655753 4911 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.671811 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds2z5"] Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.674289 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.674353 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.675700 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.175685333 +0000 UTC m=+138.314382104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.693899 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.713648 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.714253 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"] Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.733538 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.769903 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.775292 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.775849 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.275781609 +0000 UTC m=+138.414478380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.776488 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.777441 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.277432724 +0000 UTC m=+138.416129495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.782028 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.805242 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.812562 4911 request.go:700] Waited for 1.897113585s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.836620 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgbfm\" (UniqueName: \"kubernetes.io/projected/cf127d54-ebc4-4ac9-9019-0f62813f311d-kube-api-access-pgbfm\") pod \"authentication-operator-69f744f599-zph9g\" (UID: \"cf127d54-ebc4-4ac9-9019-0f62813f311d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.847140 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49qf\" (UniqueName: \"kubernetes.io/projected/d4bf6fe2-fff8-4768-a6bd-157db7bd39cb-kube-api-access-j49qf\") pod \"openshift-config-operator-7777fb866f-tj98q\" (UID: \"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.868069 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvmq\" (UniqueName: \"kubernetes.io/projected/067e922f-4238-434f-b747-e8adb2c478c0-kube-api-access-ldvmq\") pod \"multus-admission-controller-857f4d67dd-6w6x2\" (UID: \"067e922f-4238-434f-b747-e8adb2c478c0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.877331 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.878030 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.377994102 +0000 UTC m=+138.516690883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.878189 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.878919 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.378909246 +0000 UTC m=+138.517606017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.879257 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.892213 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg6vn\" (UniqueName: \"kubernetes.io/projected/afd71e8f-dc8d-4906-9c94-e49ba1738d00-kube-api-access-pg6vn\") pod \"dns-operator-744455d44c-j8fnx\" (UID: \"afd71e8f-dc8d-4906-9c94-e49ba1738d00\") " pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.907026 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-78xv9\" (UID: \"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.929003 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc4sq\" (UniqueName: \"kubernetes.io/projected/91ebd18d-f20f-4d2c-9607-9fedbf105795-kube-api-access-wc4sq\") pod \"kube-storage-version-migrator-operator-b67b599dd-zprl7\" (UID: \"91ebd18d-f20f-4d2c-9607-9fedbf105795\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.940674 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.947484 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.964171 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.972737 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbhb2\" (UniqueName: \"kubernetes.io/projected/8e03a3bd-26ee-4772-9ff5-d802750daa8b-kube-api-access-xbhb2\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.979314 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.979969 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-node-bootstrap-token\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 
00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.980069 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/197d08f0-6e72-49e1-8e05-bc571808c8d3-metrics-tls\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.980156 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f81c0bb-de86-40af-8412-ceb41bf9478e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tv9lz\" (UID: \"5f81c0bb-de86-40af-8412-ceb41bf9478e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.980222 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.480188014 +0000 UTC m=+138.618884805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.980298 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-certs\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.980357 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11919566-9901-4428-a994-c2af062e6b24-cert\") pod \"ingress-canary-96zzw\" (UID: \"11919566-9901-4428-a994-c2af062e6b24\") " pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.980389 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/197d08f0-6e72-49e1-8e05-bc571808c8d3-config-volume\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.980421 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd57385-607f-4585-853f-f7ab3b4dd18d-config\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:37 
crc kubenswrapper[4911]: I1201 00:09:37.980510 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:37 crc kubenswrapper[4911]: E1201 00:09:37.980898 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.480887813 +0000 UTC m=+138.619584584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.983216 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.984091 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd57385-607f-4585-853f-f7ab3b4dd18d-config\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.985832 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f81c0bb-de86-40af-8412-ceb41bf9478e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tv9lz\" (UID: \"5f81c0bb-de86-40af-8412-ceb41bf9478e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.987695 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/197d08f0-6e72-49e1-8e05-bc571808c8d3-metrics-tls\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.987845 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-node-bootstrap-token\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.989904 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/11919566-9901-4428-a994-c2af062e6b24-cert\") pod \"ingress-canary-96zzw\" (UID: \"11919566-9901-4428-a994-c2af062e6b24\") " pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:37 crc kubenswrapper[4911]: I1201 00:09:37.992184 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0cdb2706-8203-45ab-a34a-668f1d0dd300-certs\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.013310 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzzwb\" (UniqueName: \"kubernetes.io/projected/3dd65335-e065-4572-9e3d-912fe012056b-kube-api-access-pzzwb\") pod \"collect-profiles-29409120-zht4m\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.033332 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8g9p\" (UniqueName: \"kubernetes.io/projected/c65fab7f-8379-4eba-85dd-eb90f2f71100-kube-api-access-f8g9p\") pod \"machine-approver-56656f9798-tl94z\" (UID: \"c65fab7f-8379-4eba-85dd-eb90f2f71100\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.054153 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e03a3bd-26ee-4772-9ff5-d802750daa8b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bl67v\" (UID: \"8e03a3bd-26ee-4772-9ff5-d802750daa8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.077031 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kvrts\" (UniqueName: \"kubernetes.io/projected/f4878bb4-63b2-481b-8055-dc5d69809b39-kube-api-access-kvrts\") pod \"machine-api-operator-5694c8668f-4j8rl\" (UID: \"f4878bb4-63b2-481b-8055-dc5d69809b39\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.081081 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tj98q"] Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.081331 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.082383 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.582355136 +0000 UTC m=+138.721051937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.101403 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.106851 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.109990 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gnrs\" (UniqueName: \"kubernetes.io/projected/b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6-kube-api-access-8gnrs\") pod \"cluster-image-registry-operator-dc59b4c8b-4czwh\" (UID: \"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.119970 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lll4l\" (UniqueName: \"kubernetes.io/projected/ed8e11d1-011f-4ff0-8779-f54adbc8d46c-kube-api-access-lll4l\") pod \"machine-config-operator-74547568cd-dxqzj\" (UID: \"ed8e11d1-011f-4ff0-8779-f54adbc8d46c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.130698 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-bound-sa-token\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.155162 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.158306 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c86t\" (UniqueName: \"kubernetes.io/projected/3daa9b4f-c005-418c-854c-a81a04ab607a-kube-api-access-5c86t\") pod \"router-default-5444994796-l6g55\" (UID: \"3daa9b4f-c005-418c-854c-a81a04ab607a\") " pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.173808 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zph9g"] Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.179414 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tzxf\" (UniqueName: \"kubernetes.io/projected/319f8b92-7275-4984-aaf1-4b04d926062f-kube-api-access-4tzxf\") pod \"migrator-59844c95c7-gsckt\" (UID: \"319f8b92-7275-4984-aaf1-4b04d926062f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.182910 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.183358 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.683336625 +0000 UTC m=+138.822033416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.189646 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whrf7\" (UniqueName: \"kubernetes.io/projected/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-kube-api-access-whrf7\") pod \"image-pruner-29409120-hbwz5\" (UID: \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\") " pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.206951 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.208604 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf4vc\" (UniqueName: \"kubernetes.io/projected/28d6f2b1-3fb7-4b11-8e40-7048f385cc5c-kube-api-access-tf4vc\") pod \"apiserver-7bbb656c7d-2v4g2\" (UID: \"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.213604 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.218950 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6w6x2"] Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.230031 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2hm7\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-kube-api-access-q2hm7\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.246980 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqr92\" (UniqueName: \"kubernetes.io/projected/269e7d1c-4918-4344-8427-58a6c9c1d8e7-kube-api-access-gqr92\") pod \"cluster-samples-operator-665b6dd947-jrw9t\" (UID: \"269e7d1c-4918-4344-8427-58a6c9c1d8e7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.252928 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.271997 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e968b321-636f-418d-b788-445a0b3cc2a2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c7r5t\" (UID: \"e968b321-636f-418d-b788-445a0b3cc2a2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.272009 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.275413 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" event={"ID":"dd16121c-421a-4466-8cf9-75c9c77e461a","Type":"ContainerStarted","Data":"b9bf4775a866bf33d06556c59526e6e8a7c636ddfcdc315028897ece28c4be55"} Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.276602 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" event={"ID":"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e","Type":"ContainerStarted","Data":"2a1b35915b28c5d7230d93ad5465e2e17d2090b546893f6e4e3fb394934d04ac"} Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.283735 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.283864 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.783837242 +0000 UTC m=+138.922534023 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.284179 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.284496 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.784487899 +0000 UTC m=+138.923184670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.284514 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.288091 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqspr\" (UniqueName: \"kubernetes.io/projected/0cdb2706-8203-45ab-a34a-668f1d0dd300-kube-api-access-gqspr\") pod \"machine-config-server-zj9mr\" (UID: \"0cdb2706-8203-45ab-a34a-668f1d0dd300\") " pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.292202 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.298418 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.317394 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.319063 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7frgp\" (UniqueName: \"kubernetes.io/projected/b88051a1-6f40-46c1-b01e-78de96d4a909-kube-api-access-7frgp\") pod \"csi-hostpathplugin-82mw2\" (UID: \"b88051a1-6f40-46c1-b01e-78de96d4a909\") " pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.331715 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7649r\" (UniqueName: \"kubernetes.io/projected/2f47c16a-94f6-48e8-8757-ffea1b773ec8-kube-api-access-7649r\") pod \"console-operator-58897d9998-96ztf\" (UID: \"2f47c16a-94f6-48e8-8757-ffea1b773ec8\") " pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.367823 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abbd2d3c-0108-407f-9217-b9fbedbf3c1f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pflz5\" (UID: \"abbd2d3c-0108-407f-9217-b9fbedbf3c1f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.370301 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxb68\" (UniqueName: \"kubernetes.io/projected/11919566-9901-4428-a994-c2af062e6b24-kube-api-access-sxb68\") pod \"ingress-canary-96zzw\" (UID: \"11919566-9901-4428-a994-c2af062e6b24\") " pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.382017 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.384288 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.385209 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.385642 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:38.885609763 +0000 UTC m=+139.024306564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.402420 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cft4r\" (UniqueName: \"kubernetes.io/projected/132c1139-f0a6-4ca4-97db-bc89244c6b26-kube-api-access-cft4r\") pod \"service-ca-9c57cc56f-m2dh9\" (UID: \"132c1139-f0a6-4ca4-97db-bc89244c6b26\") " pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.405784 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.420880 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgx2t\" (UniqueName: \"kubernetes.io/projected/65c9bf23-09ab-404c-acbd-21fb12f9f441-kube-api-access-mgx2t\") pod \"package-server-manager-789f6589d5-5s5s4\" (UID: \"65c9bf23-09ab-404c-acbd-21fb12f9f441\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.434236 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhwfm\" (UniqueName: \"kubernetes.io/projected/97b939bc-01a8-41d3-90be-642b0bf45a7b-kube-api-access-bhwfm\") pod \"catalog-operator-68c6474976-jrmgz\" (UID: \"97b939bc-01a8-41d3-90be-642b0bf45a7b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.445045 
4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.455500 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5fg7\" (UniqueName: \"kubernetes.io/projected/5f81c0bb-de86-40af-8412-ceb41bf9478e-kube-api-access-k5fg7\") pod \"control-plane-machine-set-operator-78cbb6b69f-tv9lz\" (UID: \"5f81c0bb-de86-40af-8412-ceb41bf9478e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.458152 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.483191 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gxkx\" (UniqueName: \"kubernetes.io/projected/b93f1ee0-0bab-401d-add3-fa49bb88a3dc-kube-api-access-7gxkx\") pod \"olm-operator-6b444d44fb-2vgpp\" (UID: \"b93f1ee0-0bab-401d-add3-fa49bb88a3dc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.488722 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.489132 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 00:09:38.989113811 +0000 UTC m=+139.127810592 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.489358 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.492227 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.498942 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-96zzw" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.501214 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt2t2\" (UniqueName: \"kubernetes.io/projected/0be5141c-ebec-4e7b-9e3e-b5c75a16462d-kube-api-access-gt2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-lslfw\" (UID: \"0be5141c-ebec-4e7b-9e3e-b5c75a16462d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.520776 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-82mw2" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.526499 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg7l2\" (UniqueName: \"kubernetes.io/projected/92365c00-de26-4e68-89d2-724cf199e249-kube-api-access-pg7l2\") pod \"downloads-7954f5f757-ckndj\" (UID: \"92365c00-de26-4e68-89d2-724cf199e249\") " pod="openshift-console/downloads-7954f5f757-ckndj" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.539832 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zj9mr" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.548729 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdncd\" (UniqueName: \"kubernetes.io/projected/2722c6c0-aedc-4764-8e11-7a7b5b694151-kube-api-access-rdncd\") pod \"machine-config-controller-84d6567774-7mgdz\" (UID: \"2722c6c0-aedc-4764-8e11-7a7b5b694151\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.559086 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4fkn\" (UniqueName: \"kubernetes.io/projected/017917e8-8480-473b-858b-46626ef5f770-kube-api-access-d4fkn\") pod \"console-f9d7485db-5twfs\" (UID: \"017917e8-8480-473b-858b-46626ef5f770\") " pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.572730 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djwn8\" (UniqueName: \"kubernetes.io/projected/85eaed94-1314-4f16-bdf1-a598b183d97c-kube-api-access-djwn8\") pod \"oauth-openshift-558db77b4-9qfhm\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 
00:09:38.589604 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.589784 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.089744851 +0000 UTC m=+139.228441662 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.589899 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.590404 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 00:09:39.090378878 +0000 UTC m=+139.229075689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.594344 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8w46\" (UniqueName: \"kubernetes.io/projected/dbd57385-607f-4585-853f-f7ab3b4dd18d-kube-api-access-v8w46\") pod \"service-ca-operator-777779d784-pjc2h\" (UID: \"dbd57385-607f-4585-853f-f7ab3b4dd18d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.633377 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb62x\" (UniqueName: \"kubernetes.io/projected/3b4fbe7d-7033-41fc-8304-dd64d5a6f34e-kube-api-access-mb62x\") pod \"apiserver-76f77b778f-jfg64\" (UID: \"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.652028 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.666580 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.678284 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-ckndj" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.680273 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hh47\" (UniqueName: \"kubernetes.io/projected/a4b27e6c-5803-46ae-ac80-00f249cb714c-kube-api-access-9hh47\") pod \"marketplace-operator-79b997595-66zhz\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.691522 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.692067 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.692324 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.192287922 +0000 UTC m=+139.330984743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.692452 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.693077 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.193052923 +0000 UTC m=+139.331749744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.697998 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.705343 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmjqz\" (UniqueName: \"kubernetes.io/projected/fc7d68bd-5c1a-4e4e-a821-560a9e258d7f-kube-api-access-hmjqz\") pod \"packageserver-d55dfcdfc-9sfv4\" (UID: \"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.719786 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.729199 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.739423 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.755891 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.771805 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.781167 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.794082 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.794285 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.294257639 +0000 UTC m=+139.432954440 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.794594 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.795067 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.29505214 +0000 UTC m=+139.433748951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.895271 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.895486 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.395442324 +0000 UTC m=+139.534139105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.895619 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.896100 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.396072431 +0000 UTC m=+139.534769242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.959528 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.996811 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.996988 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.496954368 +0000 UTC m=+139.635651179 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:38 crc kubenswrapper[4911]: I1201 00:09:38.997226 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:38 crc kubenswrapper[4911]: E1201 00:09:38.997938 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.497916544 +0000 UTC m=+139.636613325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.098936 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.099081 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.599053257 +0000 UTC m=+139.737750038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.099266 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.099674 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.599661023 +0000 UTC m=+139.738357804 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.200671 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.200902 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.700860428 +0000 UTC m=+139.839557240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.201362 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.202198 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.702180004 +0000 UTC m=+139.840876785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.312184 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.313059 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.813031982 +0000 UTC m=+139.951728793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.356761 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/197d08f0-6e72-49e1-8e05-bc571808c8d3-config-volume\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.357792 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8grvj\" (UniqueName: \"kubernetes.io/projected/4e11323f-4d92-45ac-86e3-880c8437bfbb-kube-api-access-8grvj\") pod \"openshift-controller-manager-operator-756b6f6bc6-kvdb2\" (UID: \"4e11323f-4d92-45ac-86e3-880c8437bfbb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.356999 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcst9\" (UniqueName: \"kubernetes.io/projected/197d08f0-6e72-49e1-8e05-bc571808c8d3-kube-api-access-rcst9\") pod \"dns-default-4v62n\" (UID: \"197d08f0-6e72-49e1-8e05-bc571808c8d3\") " pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.364481 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.370168 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgss5\" (UniqueName: \"kubernetes.io/projected/fd36e9a2-62fa-4a02-80cd-bbbd26917da5-kube-api-access-fgss5\") pod \"etcd-operator-b45778765-788ks\" (UID: \"fd36e9a2-62fa-4a02-80cd-bbbd26917da5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:39 crc kubenswrapper[4911]: W1201 00:09:39.376947 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4bf6fe2_fff8_4768_a6bd_157db7bd39cb.slice/crio-f77a945e0962b7ef233507295730f897c44762e15d73032bbfa33e4b95f38592 WatchSource:0}: Error finding container f77a945e0962b7ef233507295730f897c44762e15d73032bbfa33e4b95f38592: Status 404 returned error can't find the container with id f77a945e0962b7ef233507295730f897c44762e15d73032bbfa33e4b95f38592 Dec 01 00:09:39 crc kubenswrapper[4911]: W1201 00:09:39.389840 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf127d54_ebc4_4ac9_9019_0f62813f311d.slice/crio-e781f1c60e70e23fe15cd84719a074527c4c7ef9ce0680c6bb4a3674b2baec01 WatchSource:0}: Error finding container e781f1c60e70e23fe15cd84719a074527c4c7ef9ce0680c6bb4a3674b2baec01: Status 404 returned error can't find the container with id e781f1c60e70e23fe15cd84719a074527c4c7ef9ce0680c6bb4a3674b2baec01 Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.415340 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.416133 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:39.916107298 +0000 UTC m=+140.054804149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.431507 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.517145 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.517984 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.017949061 +0000 UTC m=+140.156645852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.546477 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.619540 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.620056 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.12003002 +0000 UTC m=+140.258726791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.654359 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7"] Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.720924 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.721071 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.221047261 +0000 UTC m=+140.359744032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.721685 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.722052 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.222044148 +0000 UTC m=+140.360740919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: W1201 00:09:39.798340 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc65fab7f_8379_4eba_85dd_eb90f2f71100.slice/crio-7417c2cf0a9609e35b4e48c561c57a18ecf4acbbe10b7be76097560c91d28b89 WatchSource:0}: Error finding container 7417c2cf0a9609e35b4e48c561c57a18ecf4acbbe10b7be76097560c91d28b89: Status 404 returned error can't find the container with id 7417c2cf0a9609e35b4e48c561c57a18ecf4acbbe10b7be76097560c91d28b89 Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.824655 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:39 crc kubenswrapper[4911]: E1201 00:09:39.825065 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.325050532 +0000 UTC m=+140.463747303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:39 crc kubenswrapper[4911]: W1201 00:09:39.832924 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91ebd18d_f20f_4d2c_9607_9fedbf105795.slice/crio-4fdefbd4c02d568ae33c5545eedecdf602c12edc75ef934a9b2699d12905fba4 WatchSource:0}: Error finding container 4fdefbd4c02d568ae33c5545eedecdf602c12edc75ef934a9b2699d12905fba4: Status 404 returned error can't find the container with id 4fdefbd4c02d568ae33c5545eedecdf602c12edc75ef934a9b2699d12905fba4 Dec 01 00:09:39 crc kubenswrapper[4911]: W1201 00:09:39.925921 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3daa9b4f_c005_418c_854c_a81a04ab607a.slice/crio-35c612b3e1a1fddcd413a6a9764150fdce709ca253d96d79485e484cf458eb3c WatchSource:0}: Error finding container 35c612b3e1a1fddcd413a6a9764150fdce709ca253d96d79485e484cf458eb3c: Status 404 returned error can't find the container with id 35c612b3e1a1fddcd413a6a9764150fdce709ca253d96d79485e484cf458eb3c Dec 01 00:09:39 crc kubenswrapper[4911]: I1201 00:09:39.926420 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:39 crc 
kubenswrapper[4911]: E1201 00:09:39.926816 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.426801082 +0000 UTC m=+140.565497853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.031108 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.032018 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.532002646 +0000 UTC m=+140.670699417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.137323 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.138082 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.638059664 +0000 UTC m=+140.776756435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.238296 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.238569 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.73855435 +0000 UTC m=+140.877251121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.263121 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz"] Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.263183 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v"] Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.263205 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj"] Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.339854 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.340189 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.840175937 +0000 UTC m=+140.978872708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.360076 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" event={"ID":"91ebd18d-f20f-4d2c-9607-9fedbf105795","Type":"ContainerStarted","Data":"4fdefbd4c02d568ae33c5545eedecdf602c12edc75ef934a9b2699d12905fba4"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.374343 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" event={"ID":"cf127d54-ebc4-4ac9-9019-0f62813f311d","Type":"ContainerStarted","Data":"e781f1c60e70e23fe15cd84719a074527c4c7ef9ce0680c6bb4a3674b2baec01"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.375752 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" event={"ID":"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e","Type":"ContainerStarted","Data":"12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.378099 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.386733 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" 
event={"ID":"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb","Type":"ContainerStarted","Data":"f77a945e0962b7ef233507295730f897c44762e15d73032bbfa33e4b95f38592"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.390150 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" event={"ID":"067e922f-4238-434f-b747-e8adb2c478c0","Type":"ContainerStarted","Data":"1a746b3f75f9fdbde425fe103d60a88d45701b600a6df1c4d7617bdbe779ff12"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.400770 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.401373 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" event={"ID":"dd16121c-421a-4466-8cf9-75c9c77e461a","Type":"ContainerStarted","Data":"94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.401705 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.408997 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-l6g55" event={"ID":"3daa9b4f-c005-418c-854c-a81a04ab607a","Type":"ContainerStarted","Data":"35c612b3e1a1fddcd413a6a9764150fdce709ca253d96d79485e484cf458eb3c"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.417844 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.422060 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" 
event={"ID":"c65fab7f-8379-4eba-85dd-eb90f2f71100","Type":"ContainerStarted","Data":"7417c2cf0a9609e35b4e48c561c57a18ecf4acbbe10b7be76097560c91d28b89"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.432708 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zj9mr" event={"ID":"0cdb2706-8203-45ab-a34a-668f1d0dd300","Type":"ContainerStarted","Data":"1a54dd73e2bc504f2cb5450e63866b12e8d8d76d34d6ee1ff8b4a87d2ab33072"} Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.441769 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.442172 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:40.942158463 +0000 UTC m=+141.080855234 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.546059 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.546438 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.046424642 +0000 UTC m=+141.185121423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: W1201 00:09:40.578947 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f81c0bb_de86_40af_8412_ceb41bf9478e.slice/crio-50774ec539a7c8a01c833a1ff1125fbc7b8841e54748d8a0cfbd86d071e40ef4 WatchSource:0}: Error finding container 50774ec539a7c8a01c833a1ff1125fbc7b8841e54748d8a0cfbd86d071e40ef4: Status 404 returned error can't find the container with id 50774ec539a7c8a01c833a1ff1125fbc7b8841e54748d8a0cfbd86d071e40ef4 Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.600294 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" podStartSLOduration=118.600280583 podStartE2EDuration="1m58.600280583s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:40.598394102 +0000 UTC m=+140.737090873" watchObservedRunningTime="2025-12-01 00:09:40.600280583 +0000 UTC m=+140.738977354" Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.636600 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" podStartSLOduration=119.636452154 podStartE2EDuration="1m59.636452154s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:40.633771342 +0000 UTC m=+140.772468113" watchObservedRunningTime="2025-12-01 00:09:40.636452154 +0000 UTC m=+140.775148925" Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.651361 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.651889 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.151872583 +0000 UTC m=+141.290569354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.686591 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zj9mr" podStartSLOduration=5.686569764 podStartE2EDuration="5.686569764s" podCreationTimestamp="2025-12-01 00:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:40.682814932 +0000 UTC m=+140.821511703" watchObservedRunningTime="2025-12-01 00:09:40.686569764 +0000 UTC m=+140.825266545" Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.757109 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.757705 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.257687873 +0000 UTC m=+141.396384644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.861748 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.862117 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.362101906 +0000 UTC m=+141.500798667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:40 crc kubenswrapper[4911]: I1201 00:09:40.963449 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:40 crc kubenswrapper[4911]: E1201 00:09:40.963878 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.463862486 +0000 UTC m=+141.602559257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.065818 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.066030 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.566002457 +0000 UTC m=+141.704699218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.066133 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.066504 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.566491851 +0000 UTC m=+141.705188612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.167423 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.167612 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.667571523 +0000 UTC m=+141.806268284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.167995 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.168324 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.668311773 +0000 UTC m=+141.807008544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.270001 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.270430 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.770415283 +0000 UTC m=+141.909112054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.372932 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.373285 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.873269883 +0000 UTC m=+142.011966644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.480958 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.481294 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:41.981279873 +0000 UTC m=+142.119976644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.510904 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-l6g55" event={"ID":"3daa9b4f-c005-418c-854c-a81a04ab607a","Type":"ContainerStarted","Data":"240ca6762502b34888c444a4343b56d0c441c1f1b4c21976f6d0b66544aa5834"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.531857 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" event={"ID":"8e03a3bd-26ee-4772-9ff5-d802750daa8b","Type":"ContainerStarted","Data":"e3c7fb71ed5975f860c4c276d5eb146ede22affb1dc3fca9f1062af3b272cb4f"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.531901 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" event={"ID":"8e03a3bd-26ee-4772-9ff5-d802750daa8b","Type":"ContainerStarted","Data":"7498061156ab08dfbdda72823be1f90b25712d4fcf24cb2da6cba384f41aecfa"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.531911 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" event={"ID":"8e03a3bd-26ee-4772-9ff5-d802750daa8b","Type":"ContainerStarted","Data":"e5003e03a8feeb65fef002180ab74965422186edabe2a9a5a9b78d217dda81fa"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.568992 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" 
event={"ID":"c65fab7f-8379-4eba-85dd-eb90f2f71100","Type":"ContainerStarted","Data":"691823be15ee13761e88f1a4d88aa18ec0fa6b3ce154556f764b28ebe5352bd3"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.569041 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" event={"ID":"c65fab7f-8379-4eba-85dd-eb90f2f71100","Type":"ContainerStarted","Data":"2661c1bfe3f25a54cfb19c31e76f396f2dea1a3a03d71dc5619fe83b3df5dc89"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.582613 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.582912 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.08290056 +0000 UTC m=+142.221597321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.599174 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" event={"ID":"ed8e11d1-011f-4ff0-8779-f54adbc8d46c","Type":"ContainerStarted","Data":"b2c9f96b0d804f42e4ccfbd6131c49b3088fc26731e82c5a2a7580904fd0a8b8"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.599216 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" event={"ID":"ed8e11d1-011f-4ff0-8779-f54adbc8d46c","Type":"ContainerStarted","Data":"9ce33317d906088149b2b4d382fcebc22ee46d96bea356d84a75ae1707c37253"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.611585 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tl94z" podStartSLOduration=120.611565418 podStartE2EDuration="2m0.611565418s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:41.604821405 +0000 UTC m=+141.743518176" watchObservedRunningTime="2025-12-01 00:09:41.611565418 +0000 UTC m=+141.750262199" Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.615990 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-l6g55" podStartSLOduration=120.615947567 
podStartE2EDuration="2m0.615947567s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:41.575230652 +0000 UTC m=+141.713927423" watchObservedRunningTime="2025-12-01 00:09:41.615947567 +0000 UTC m=+141.754644338" Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.625235 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" event={"ID":"5f81c0bb-de86-40af-8412-ceb41bf9478e","Type":"ContainerStarted","Data":"47691d8032195a751711629335e50d0d29f1915786fd24229208a1269e8b5d46"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.625704 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" event={"ID":"5f81c0bb-de86-40af-8412-ceb41bf9478e","Type":"ContainerStarted","Data":"50774ec539a7c8a01c833a1ff1125fbc7b8841e54748d8a0cfbd86d071e40ef4"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.632714 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bl67v" podStartSLOduration=120.632699471 podStartE2EDuration="2m0.632699471s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:41.628536898 +0000 UTC m=+141.767233669" watchObservedRunningTime="2025-12-01 00:09:41.632699471 +0000 UTC m=+141.771396242" Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.641748 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zj9mr" event={"ID":"0cdb2706-8203-45ab-a34a-668f1d0dd300","Type":"ContainerStarted","Data":"a48df811f0fbb81ac842ff75d42c59c58e25e0ec87276fafdad4451f630a84da"} Dec 01 
00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.658009 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" event={"ID":"067e922f-4238-434f-b747-e8adb2c478c0","Type":"ContainerStarted","Data":"8176fc9b5d8233d4f839515f1a3d7da300f141289d47ff13cb7de993ab7d28b5"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.658054 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" event={"ID":"067e922f-4238-434f-b747-e8adb2c478c0","Type":"ContainerStarted","Data":"345d13939d47c20c866f5af15dd130bd4754f223261ab24dd21df1ff503dcf20"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.686068 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.686270 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.186236384 +0000 UTC m=+142.324933155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.686740 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.686762 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tv9lz" podStartSLOduration=119.686744757 podStartE2EDuration="1m59.686744757s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:41.663274711 +0000 UTC m=+141.801971482" watchObservedRunningTime="2025-12-01 00:09:41.686744757 +0000 UTC m=+141.825441528" Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.688175 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.188163316 +0000 UTC m=+142.326860077 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.690624 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp"] Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.698110 4911 generic.go:334] "Generic (PLEG): container finished" podID="d4bf6fe2-fff8-4768-a6bd-157db7bd39cb" containerID="a44de71536b5e6baa9c4d6816339f3c0716e480178fd54fa67c286c7fa139764" exitCode=0 Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.698199 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" event={"ID":"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb","Type":"ContainerDied","Data":"a44de71536b5e6baa9c4d6816339f3c0716e480178fd54fa67c286c7fa139764"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.711920 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" event={"ID":"91ebd18d-f20f-4d2c-9607-9fedbf105795","Type":"ContainerStarted","Data":"15b6d8402b89a2ad35d076358a9d8b6243b6db9818e2e527dd1e28358b7429f3"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.714527 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4"] Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.723220 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-6w6x2" podStartSLOduration=119.723198516 podStartE2EDuration="1m59.723198516s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:41.722784945 +0000 UTC m=+141.861481716" watchObservedRunningTime="2025-12-01 00:09:41.723198516 +0000 UTC m=+141.861895277" Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.727093 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" event={"ID":"cf127d54-ebc4-4ac9-9019-0f62813f311d","Type":"ContainerStarted","Data":"e7c464b431f469cb7c38164996d7b0f7e57eebfc24a755a3a3231e4d4fd8a2c4"} Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.736441 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz"] Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.787568 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.788878 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.288861798 +0000 UTC m=+142.427558569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.827682 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zprl7" podStartSLOduration=120.82766555 podStartE2EDuration="2m0.82766555s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:41.786697559 +0000 UTC m=+141.925394330" watchObservedRunningTime="2025-12-01 00:09:41.82766555 +0000 UTC m=+141.966362321" Dec 01 00:09:41 crc kubenswrapper[4911]: I1201 00:09:41.892480 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:41 crc kubenswrapper[4911]: E1201 00:09:41.896174 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.396157588 +0000 UTC m=+142.534854359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.002721 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.003072 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.503053368 +0000 UTC m=+142.641750139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.109138 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.109788 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.609770014 +0000 UTC m=+142.748466785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.153607 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-zph9g" podStartSLOduration=121.153587082 podStartE2EDuration="2m1.153587082s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:41.836334286 +0000 UTC m=+141.975031057" watchObservedRunningTime="2025-12-01 00:09:42.153587082 +0000 UTC m=+142.292283853" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.159033 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j8fnx"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.163789 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jfg64"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.193961 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.206055 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29409120-hbwz5"] Dec 01 00:09:42 crc kubenswrapper[4911]: W1201 00:09:42.209650 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2722c6c0_aedc_4764_8e11_7a7b5b694151.slice/crio-dd44c6183fb14d3c37ce90ef7dd3c1d1ef8edc119059cfd620c341727c344707 WatchSource:0}: Error finding container dd44c6183fb14d3c37ce90ef7dd3c1d1ef8edc119059cfd620c341727c344707: Status 404 returned error can't find the container with id dd44c6183fb14d3c37ce90ef7dd3c1d1ef8edc119059cfd620c341727c344707 Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.210193 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.210308 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.710288621 +0000 UTC m=+142.848985392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.210388 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.210765 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.710751363 +0000 UTC m=+142.849448134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.257177 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.271681 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66zhz"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.280636 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9qfhm"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.297740 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.298942 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.313129 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.321886 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.322308 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.822290779 +0000 UTC m=+142.960987540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.334938 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4j8rl"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.335191 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m2dh9"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.343986 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ckndj"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.350514 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-96zzw"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.371354 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-82mw2"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.399802 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t"] Dec 01 00:09:42 crc 
kubenswrapper[4911]: I1201 00:09:42.401613 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.414850 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.423017 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.423395 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:42.923377081 +0000 UTC m=+143.062073852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.436619 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5twfs"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.453605 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.459574 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.463422 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:42 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:42 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:42 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.463793 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.483797 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-4v62n"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.494451 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.515207 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.526111 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.526539 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.02652389 +0000 UTC m=+143.165220661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: W1201 00:09:42.536423 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode968b321_636f_418d_b788_445a0b3cc2a2.slice/crio-ed37d879d6eec20e7a6e42ddb590edd94879b9af476ed8fdb542879e437dfed0 WatchSource:0}: Error finding container ed37d879d6eec20e7a6e42ddb590edd94879b9af476ed8fdb542879e437dfed0: Status 404 returned error can't find the container with id ed37d879d6eec20e7a6e42ddb590edd94879b9af476ed8fdb542879e437dfed0 Dec 01 00:09:42 crc kubenswrapper[4911]: W1201 00:09:42.537326 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod319f8b92_7275_4984_aaf1_4b04d926062f.slice/crio-929b1e9ab54395682a57ff2ca8f3d05828d31567a64bf37de4d5313bb6777692 WatchSource:0}: Error finding container 929b1e9ab54395682a57ff2ca8f3d05828d31567a64bf37de4d5313bb6777692: Status 404 returned error can't find the container with id 929b1e9ab54395682a57ff2ca8f3d05828d31567a64bf37de4d5313bb6777692 Dec 01 00:09:42 crc kubenswrapper[4911]: W1201 00:09:42.575620 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4fae1ac_6d4d_4bcb_abc2_d1495eb7b568.slice/crio-907faf2485bfe4262cf5cee02e0b833418b9c90cabb0dd88dec9beff91708757 WatchSource:0}: Error finding container 907faf2485bfe4262cf5cee02e0b833418b9c90cabb0dd88dec9beff91708757: Status 404 returned error can't find the container with 
id 907faf2485bfe4262cf5cee02e0b833418b9c90cabb0dd88dec9beff91708757 Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.581694 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-96ztf"] Dec 01 00:09:42 crc kubenswrapper[4911]: W1201 00:09:42.621139 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0be5141c_ebec_4e7b_9e3e_b5c75a16462d.slice/crio-93336ce0b8d65b66cab27452c74524f1f78be7b905c9379f37d0eb4e453a69f6 WatchSource:0}: Error finding container 93336ce0b8d65b66cab27452c74524f1f78be7b905c9379f37d0eb4e453a69f6: Status 404 returned error can't find the container with id 93336ce0b8d65b66cab27452c74524f1f78be7b905c9379f37d0eb4e453a69f6 Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.627509 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.658453 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.158419268 +0000 UTC m=+143.297116039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.668611 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.686575 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.713515 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-788ks"] Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.731307 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.731869 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.231845069 +0000 UTC m=+143.370541840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: W1201 00:09:42.750664 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd65335_e065_4572_9e3d_912fe012056b.slice/crio-57ef82511266f563b3287a8082c344024ee242e67739a29775bd06bb260ae07d WatchSource:0}: Error finding container 57ef82511266f563b3287a8082c344024ee242e67739a29775bd06bb260ae07d: Status 404 returned error can't find the container with id 57ef82511266f563b3287a8082c344024ee242e67739a29775bd06bb260ae07d Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.750733 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" event={"ID":"4e11323f-4d92-45ac-86e3-880c8437bfbb","Type":"ContainerStarted","Data":"f586c2f72bea53d429a3a6f4a0e367ed68bae40671f1237d44ca6230e981e2a1"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.753810 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4v62n" event={"ID":"197d08f0-6e72-49e1-8e05-bc571808c8d3","Type":"ContainerStarted","Data":"355b3c9b7db572293205c68699db79f8276b9424a3870a077a557bcfa40784b0"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.755028 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5twfs" event={"ID":"017917e8-8480-473b-858b-46626ef5f770","Type":"ContainerStarted","Data":"b04646dcb4314640d824ab7d1fb4eeef030afbe108466c0e97fbb645dddd182d"} Dec 01 00:09:42 
crc kubenswrapper[4911]: I1201 00:09:42.758295 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" event={"ID":"a4b27e6c-5803-46ae-ac80-00f249cb714c","Type":"ContainerStarted","Data":"93c9631975f431c8ab23c92738167222dbf27ece32bee1b4614500b9b7bbdd57"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.762443 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" event={"ID":"319f8b92-7275-4984-aaf1-4b04d926062f","Type":"ContainerStarted","Data":"929b1e9ab54395682a57ff2ca8f3d05828d31567a64bf37de4d5313bb6777692"} Dec 01 00:09:42 crc kubenswrapper[4911]: W1201 00:09:42.763815 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d6f2b1_3fb7_4b11_8e40_7048f385cc5c.slice/crio-a273da3d5e114db77c6383542af74628b7e6420b21822a86a45ead1f6740637b WatchSource:0}: Error finding container a273da3d5e114db77c6383542af74628b7e6420b21822a86a45ead1f6740637b: Status 404 returned error can't find the container with id a273da3d5e114db77c6383542af74628b7e6420b21822a86a45ead1f6740637b Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.779490 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" event={"ID":"d4bf6fe2-fff8-4768-a6bd-157db7bd39cb","Type":"ContainerStarted","Data":"8a03c0976c29893aef46450bdf95a68c21dafa83d2afc8fa97f4c364d77be7eb"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.780365 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.784027 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" 
event={"ID":"0be5141c-ebec-4e7b-9e3e-b5c75a16462d","Type":"ContainerStarted","Data":"93336ce0b8d65b66cab27452c74524f1f78be7b905c9379f37d0eb4e453a69f6"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.812211 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-96zzw" event={"ID":"11919566-9901-4428-a994-c2af062e6b24","Type":"ContainerStarted","Data":"1560a68e92978f522269a7eef510767ab43a6f1d3de0f758d5bfbdf27fa348e4"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.836352 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.836833 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.336817827 +0000 UTC m=+143.475514598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.863586 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" event={"ID":"afd71e8f-dc8d-4906-9c94-e49ba1738d00","Type":"ContainerStarted","Data":"4c190def98d2463f6ec9b072b18f6513e8f527c4972c2dc57cf4fc133c954c11"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.902821 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" event={"ID":"2722c6c0-aedc-4764-8e11-7a7b5b694151","Type":"ContainerStarted","Data":"dd3e444cf8463292faccf085fe694131ec4eb912b96fa15b48c53c2a93e1bac0"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.902903 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" event={"ID":"2722c6c0-aedc-4764-8e11-7a7b5b694151","Type":"ContainerStarted","Data":"dd44c6183fb14d3c37ce90ef7dd3c1d1ef8edc119059cfd620c341727c344707"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.933993 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" event={"ID":"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f","Type":"ContainerStarted","Data":"73ffc54b31f4060b7a32c3cb2cbc009ed7b904b03c103a95859a6fe1905638b0"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.934066 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" event={"ID":"fc7d68bd-5c1a-4e4e-a821-560a9e258d7f","Type":"ContainerStarted","Data":"486ca9da804e04d44512a4644752a168e1f67fa3027dbbb241aa55e94a0da3d5"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.936590 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.937118 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:42 crc kubenswrapper[4911]: E1201 00:09:42.938328 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.43830973 +0000 UTC m=+143.577006501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.942245 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" event={"ID":"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568","Type":"ContainerStarted","Data":"907faf2485bfe4262cf5cee02e0b833418b9c90cabb0dd88dec9beff91708757"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.944390 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" event={"ID":"f4878bb4-63b2-481b-8055-dc5d69809b39","Type":"ContainerStarted","Data":"103f3a3961ac63275de4a0982370ed642832c6b6802625608c80e0b9d2e80f46"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.945646 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" event={"ID":"dbd57385-607f-4585-853f-f7ab3b4dd18d","Type":"ContainerStarted","Data":"85f5df1c97e5d083199669bf8f47b775d40570b1287f240b81af5a64d31499fc"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.946951 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ckndj" event={"ID":"92365c00-de26-4e68-89d2-724cf199e249","Type":"ContainerStarted","Data":"a34496255b743ff6bddb49e589e18f63de4d0a69282f1099bed637ebc09141af"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.947836 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" 
event={"ID":"97b939bc-01a8-41d3-90be-642b0bf45a7b","Type":"ContainerStarted","Data":"e696258b628b1bb703cc304531ca531b7e6024348d3413e28f8d2544b3fbc281"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.947855 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" event={"ID":"97b939bc-01a8-41d3-90be-642b0bf45a7b","Type":"ContainerStarted","Data":"4a05774e3782017c9648476d53395c30688d6ea06a5f03b88811a5608daa4f0c"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.948561 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.950043 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" event={"ID":"abbd2d3c-0108-407f-9217-b9fbedbf3c1f","Type":"ContainerStarted","Data":"2b1226fc24d91afd558499ac58d00b4a406d24437d5e2f9dd0f2b736cc47b706"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.951798 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29409120-hbwz5" event={"ID":"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc","Type":"ContainerStarted","Data":"9a1c1af3596612873fd0d0f3462aa0a6065423485922b2e98dfc8ecccb87aefe"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.951816 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29409120-hbwz5" event={"ID":"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc","Type":"ContainerStarted","Data":"adc453bb9ede638da36ad16fcce40682ec7275f7738d1ed52894a4246024caf7"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.953939 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" 
event={"ID":"85eaed94-1314-4f16-bdf1-a598b183d97c","Type":"ContainerStarted","Data":"852d6fcbccee4a01432ffe9f472351fd63221d71a10d8aed119175670b18cc18"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.955066 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" event={"ID":"269e7d1c-4918-4344-8427-58a6c9c1d8e7","Type":"ContainerStarted","Data":"1b0080fa1b8227dbc125c5db8d3dc987511c52e7fd33b77a9af0958c4d4247bf"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.956711 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" event={"ID":"b93f1ee0-0bab-401d-add3-fa49bb88a3dc","Type":"ContainerStarted","Data":"0276cdf49343761bd5ee1c1d45290f44ee3a996a7d5f929447d166befbc3d999"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.956731 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" event={"ID":"b93f1ee0-0bab-401d-add3-fa49bb88a3dc","Type":"ContainerStarted","Data":"040e615c8912f010c4f41f7879a05bc7f5506141218b7535025ae7477e8a0c30"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.957287 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.970423 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" event={"ID":"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e","Type":"ContainerStarted","Data":"7491e4b5a7af8ebee151d0fb0dd4ac9ca45864358abb41cc7fa4384a0aef8df4"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.973744 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" 
event={"ID":"132c1139-f0a6-4ca4-97db-bc89244c6b26","Type":"ContainerStarted","Data":"470c52fcb867ac9af2ed84a8702078bc399ca6fc45afbc3f25d0ec66a4a7b5fa"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.974417 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82mw2" event={"ID":"b88051a1-6f40-46c1-b01e-78de96d4a909","Type":"ContainerStarted","Data":"80688ef4a97ea3eddfdc987c40420b40c90716c9a56da40ffc75e8962c56b2ac"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.975855 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" event={"ID":"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6","Type":"ContainerStarted","Data":"00ce83a6f34d90926e519873536af078fec6233224e07143abd2b5c3f798976b"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.977485 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" event={"ID":"e968b321-636f-418d-b788-445a0b3cc2a2","Type":"ContainerStarted","Data":"ed37d879d6eec20e7a6e42ddb590edd94879b9af476ed8fdb542879e437dfed0"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.978951 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" event={"ID":"65c9bf23-09ab-404c-acbd-21fb12f9f441","Type":"ContainerStarted","Data":"43ee207f2810886c5da097391d92cf63e4926645a6d4ad1c51461fbec8cc8bee"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.979342 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.980929 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" 
event={"ID":"ed8e11d1-011f-4ff0-8779-f54adbc8d46c","Type":"ContainerStarted","Data":"21bde5380c1096dea821b176a3617ca1c09de2d2867b3e32adc7c5c3c31e0176"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.981102 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.985253 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-96ztf" event={"ID":"2f47c16a-94f6-48e8-8757-ffea1b773ec8","Type":"ContainerStarted","Data":"c1547b7c58224116cb4641ae24f1de8c9a9501289c263d1ae097283486b925a3"} Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.987172 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" podStartSLOduration=120.985905691 podStartE2EDuration="2m0.985905691s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:42.982751956 +0000 UTC m=+143.121448747" watchObservedRunningTime="2025-12-01 00:09:42.985905691 +0000 UTC m=+143.124602462" Dec 01 00:09:42 crc kubenswrapper[4911]: I1201 00:09:42.987372 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" podStartSLOduration=121.987366081 podStartE2EDuration="2m1.987366081s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:42.804800868 +0000 UTC m=+142.943497659" watchObservedRunningTime="2025-12-01 00:09:42.987366081 +0000 UTC m=+143.126062852" Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.005302 4911 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dxqzj" podStartSLOduration=121.005272547 podStartE2EDuration="2m1.005272547s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:43.002935823 +0000 UTC m=+143.141632594" watchObservedRunningTime="2025-12-01 00:09:43.005272547 +0000 UTC m=+143.143969318" Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.022831 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jrmgz" podStartSLOduration=121.022808962 podStartE2EDuration="2m1.022808962s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:43.021374234 +0000 UTC m=+143.160071005" watchObservedRunningTime="2025-12-01 00:09:43.022808962 +0000 UTC m=+143.161505733" Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.038777 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.040443 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.54043047 +0000 UTC m=+143.679127241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.041483 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2vgpp" podStartSLOduration=121.041468449 podStartE2EDuration="2m1.041468449s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:43.040300687 +0000 UTC m=+143.178997458" watchObservedRunningTime="2025-12-01 00:09:43.041468449 +0000 UTC m=+143.180165220" Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.059521 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29409120-hbwz5" podStartSLOduration=122.059505308 podStartE2EDuration="2m2.059505308s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:43.058879251 +0000 UTC m=+143.197576022" watchObservedRunningTime="2025-12-01 00:09:43.059505308 +0000 UTC m=+143.198202079" Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.141492 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.141685 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.641656717 +0000 UTC m=+143.780353488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.141816 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.144845 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.644837263 +0000 UTC m=+143.783534034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.268409 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.271117 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.768706983 +0000 UTC m=+143.907403754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.271180 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.271616 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.771607052 +0000 UTC m=+143.910303823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.372854 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.373181 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.873166187 +0000 UTC m=+144.011862958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.474181 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.474577 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:43.974564418 +0000 UTC m=+144.113261189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.483545 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:43 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:43 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:43 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.483583 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.499163 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9sfv4" Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.575265 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.575444 4911 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.075413594 +0000 UTC m=+144.214110355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.575731 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.576068 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.076056312 +0000 UTC m=+144.214753083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.676549 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.677446 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.177415981 +0000 UTC m=+144.316112752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.786243 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.786798 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.286779088 +0000 UTC m=+144.425475859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.888098 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.888258 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.388240041 +0000 UTC m=+144.526936812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.888749 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.889331 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.38931446 +0000 UTC m=+144.528011231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.991842 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.992343 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.492327324 +0000 UTC m=+144.631024095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:43 crc kubenswrapper[4911]: I1201 00:09:43.992633 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:43 crc kubenswrapper[4911]: E1201 00:09:43.992975 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.492958851 +0000 UTC m=+144.631655612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.005384 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" event={"ID":"a4b27e6c-5803-46ae-ac80-00f249cb714c","Type":"ContainerStarted","Data":"76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.006442 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.007613 4911 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66zhz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.44:8080/healthz\": dial tcp 10.217.0.44:8080: connect: connection refused" start-of-body= Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.007686 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" podUID="a4b27e6c-5803-46ae-ac80-00f249cb714c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.44:8080/healthz\": dial tcp 10.217.0.44:8080: connect: connection refused" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.037350 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" 
event={"ID":"319f8b92-7275-4984-aaf1-4b04d926062f","Type":"ContainerStarted","Data":"738a598b31b4dcd84b715841153b97893797a3523927e83cf44a97ef06ffdcb0"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.049258 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" event={"ID":"e968b321-636f-418d-b788-445a0b3cc2a2","Type":"ContainerStarted","Data":"f08fa1d8757ccddb3c0c1d828e436a5ec0f34900a71fc2ed56512830be1e85b4"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.055437 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" podStartSLOduration=122.055417566 podStartE2EDuration="2m2.055417566s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:44.054759848 +0000 UTC m=+144.193456619" watchObservedRunningTime="2025-12-01 00:09:44.055417566 +0000 UTC m=+144.194114337" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.061081 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" event={"ID":"f4fae1ac-6d4d-4bcb-abc2-d1495eb7b568","Type":"ContainerStarted","Data":"a1d50e8c590fe3659b85405f3e8dff64e7d316db15b01322ab9cf4cb86914961"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.067537 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" event={"ID":"3dd65335-e065-4572-9e3d-912fe012056b","Type":"ContainerStarted","Data":"57ef82511266f563b3287a8082c344024ee242e67739a29775bd06bb260ae07d"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.072422 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ckndj" 
event={"ID":"92365c00-de26-4e68-89d2-724cf199e249","Type":"ContainerStarted","Data":"fc2730a88fba6e28546cf83161b6b0bbb24f524351875f27b930f14b6d016b33"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.072844 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ckndj" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.074361 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.074395 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.079823 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" event={"ID":"4e11323f-4d92-45ac-86e3-880c8437bfbb","Type":"ContainerStarted","Data":"1c2c2eb4e6afa95b036179e62197d7b0699d3d86c3fad0c44ff986a579ff929d"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.080954 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c7r5t" podStartSLOduration=123.080938608 podStartE2EDuration="2m3.080938608s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:44.078874752 +0000 UTC m=+144.217571523" watchObservedRunningTime="2025-12-01 00:09:44.080938608 
+0000 UTC m=+144.219635379" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.083079 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4v62n" event={"ID":"197d08f0-6e72-49e1-8e05-bc571808c8d3","Type":"ContainerStarted","Data":"591698e911a30e9ab34d94a6969e5ce2dde0ca9a3aaa8a3ea6383bcabfb56b9c"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.094289 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.094448 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.594429464 +0000 UTC m=+144.733126235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.094563 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.094925 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.594913817 +0000 UTC m=+144.733610588 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.101569 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" event={"ID":"f4878bb4-63b2-481b-8055-dc5d69809b39","Type":"ContainerStarted","Data":"e0486abc26f52f1a040f3743874b2b827c584c0465b8aaeb6e7c3f42a34d7de9"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.122602 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" event={"ID":"abbd2d3c-0108-407f-9217-b9fbedbf3c1f","Type":"ContainerStarted","Data":"2d48a77a5fef6d793be060e77c29403fa1eccfb4bfa3039f7a84c43b5f47599b"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.123732 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-78xv9" podStartSLOduration=123.123713709 podStartE2EDuration="2m3.123713709s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:44.119896225 +0000 UTC m=+144.258592996" watchObservedRunningTime="2025-12-01 00:09:44.123713709 +0000 UTC m=+144.262410480" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.124704 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ckndj" podStartSLOduration=123.124699225 
podStartE2EDuration="2m3.124699225s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:44.092383989 +0000 UTC m=+144.231080760" watchObservedRunningTime="2025-12-01 00:09:44.124699225 +0000 UTC m=+144.263395996" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.139344 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kvdb2" podStartSLOduration=123.139326522 podStartE2EDuration="2m3.139326522s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:44.138293194 +0000 UTC m=+144.276989965" watchObservedRunningTime="2025-12-01 00:09:44.139326522 +0000 UTC m=+144.278023293" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.146602 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" event={"ID":"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c","Type":"ContainerStarted","Data":"a273da3d5e114db77c6383542af74628b7e6420b21822a86a45ead1f6740637b"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.172338 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" event={"ID":"65c9bf23-09ab-404c-acbd-21fb12f9f441","Type":"ContainerStarted","Data":"8b46d34ac2a3d1ef8de62c3d1c79bdf8049b7d9bb6e10be9902dd58c74570544"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.185245 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" event={"ID":"afd71e8f-dc8d-4906-9c94-e49ba1738d00","Type":"ContainerStarted","Data":"93baaa1f971e1bbc1439e0c53feab09c23630021f1a8539fd2184aa6573ae1bf"} Dec 
01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.191886 4911 generic.go:334] "Generic (PLEG): container finished" podID="3b4fbe7d-7033-41fc-8304-dd64d5a6f34e" containerID="7cbcf6fdb4f3d7dd312c991ff6b7de918e6a4ed3a6352b086231edb1c46b3c62" exitCode=0 Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.191944 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" event={"ID":"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e","Type":"ContainerDied","Data":"7cbcf6fdb4f3d7dd312c991ff6b7de918e6a4ed3a6352b086231edb1c46b3c62"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.195621 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.197255 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.697238073 +0000 UTC m=+144.835934854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.294763 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" event={"ID":"fd36e9a2-62fa-4a02-80cd-bbbd26917da5","Type":"ContainerStarted","Data":"870d65b8834c88a7db13bef4e946a0f638035d8442ef0df4975252425c0576c4"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.299261 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.299599 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.79958578 +0000 UTC m=+144.938282551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.307279 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-96zzw" event={"ID":"11919566-9901-4428-a994-c2af062e6b24","Type":"ContainerStarted","Data":"ad5de892d3140a5e94cd1abf2ee646a0560ec5ee1471fd0dd74a850977d9dae7"} Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.325110 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tj98q" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.331361 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pflz5" podStartSLOduration=123.331333641 podStartE2EDuration="2m3.331333641s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:44.160136387 +0000 UTC m=+144.298833148" watchObservedRunningTime="2025-12-01 00:09:44.331333641 +0000 UTC m=+144.470030412" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.332357 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-96zzw" podStartSLOduration=9.332346429 podStartE2EDuration="9.332346429s" podCreationTimestamp="2025-12-01 00:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:44.331751953 +0000 UTC m=+144.470448724" watchObservedRunningTime="2025-12-01 00:09:44.332346429 +0000 UTC m=+144.471043200" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.347416 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7k7j2"] Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.353756 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.355659 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7k7j2"] Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.358395 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.458510 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.458859 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-catalog-content\") pod \"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.458980 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-utilities\") pod 
\"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.459178 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82kq\" (UniqueName: \"kubernetes.io/projected/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-kube-api-access-h82kq\") pod \"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.460239 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:44.960206377 +0000 UTC m=+145.098903148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.467376 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:44 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:44 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:44 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.467423 4911 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.573922 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-utilities\") pod \"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.574135 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.574234 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h82kq\" (UniqueName: \"kubernetes.io/projected/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-kube-api-access-h82kq\") pod \"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.574266 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-catalog-content\") pod \"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.575102 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-catalog-content\") pod \"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.575188 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-utilities\") pod \"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.575424 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.075407543 +0000 UTC m=+145.214104314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.677228 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.677560 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.177542623 +0000 UTC m=+145.316239394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.687405 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f9xb9"] Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.688525 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.692054 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.709943 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82kq\" (UniqueName: \"kubernetes.io/projected/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-kube-api-access-h82kq\") pod \"community-operators-7k7j2\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.746968 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9xb9"] Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.780138 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6l6c\" (UniqueName: \"kubernetes.io/projected/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-kube-api-access-g6l6c\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.780178 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-catalog-content\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.780222 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.780300 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-utilities\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.780587 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.280576348 +0000 UTC m=+145.419273119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.881156 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.881795 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-utilities\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.881841 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6l6c\" (UniqueName: \"kubernetes.io/projected/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-kube-api-access-g6l6c\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.881862 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-catalog-content\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.882412 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-catalog-content\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.882508 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.382493683 +0000 UTC m=+145.521190454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.883074 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-utilities\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.915882 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pntq8"] Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.916913 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.923654 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.934735 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pntq8"] Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.940714 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6l6c\" (UniqueName: \"kubernetes.io/projected/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-kube-api-access-g6l6c\") pod \"certified-operators-f9xb9\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.953570 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8n99g"] Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.959014 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.975242 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8n99g"] Dec 01 00:09:44 crc kubenswrapper[4911]: I1201 00:09:44.989108 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:44 crc kubenswrapper[4911]: E1201 00:09:44.989672 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 00:09:45.48964931 +0000 UTC m=+145.628346081 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.091196 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.091379 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-catalog-content\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.091428 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-utilities\") pod \"community-operators-pntq8\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.091476 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-utilities\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.091496 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb88w\" (UniqueName: \"kubernetes.io/projected/b0ebaddd-1c4a-44c5-ab95-54a174396b80-kube-api-access-fb88w\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.091534 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-catalog-content\") pod \"community-operators-pntq8\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.091566 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5f2v\" (UniqueName: \"kubernetes.io/projected/fe55751b-c29f-4c22-a636-56c9e3232fdf-kube-api-access-z5f2v\") pod \"community-operators-pntq8\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.091729 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.591712189 +0000 UTC m=+145.730408960 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.205155 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5f2v\" (UniqueName: \"kubernetes.io/projected/fe55751b-c29f-4c22-a636-56c9e3232fdf-kube-api-access-z5f2v\") pod \"community-operators-pntq8\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.205221 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.205282 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-catalog-content\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.205318 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-utilities\") pod \"community-operators-pntq8\" (UID: 
\"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.205342 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-utilities\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.205360 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb88w\" (UniqueName: \"kubernetes.io/projected/b0ebaddd-1c4a-44c5-ab95-54a174396b80-kube-api-access-fb88w\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.205392 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-catalog-content\") pod \"community-operators-pntq8\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.206579 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-catalog-content\") pod \"community-operators-pntq8\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.206828 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-utilities\") pod \"community-operators-pntq8\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") 
" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.207193 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.707175332 +0000 UTC m=+145.845872103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.207188 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-utilities\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.207477 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-catalog-content\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.265275 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb88w\" (UniqueName: \"kubernetes.io/projected/b0ebaddd-1c4a-44c5-ab95-54a174396b80-kube-api-access-fb88w\") pod \"certified-operators-8n99g\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " 
pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.265348 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5f2v\" (UniqueName: \"kubernetes.io/projected/fe55751b-c29f-4c22-a636-56c9e3232fdf-kube-api-access-z5f2v\") pod \"community-operators-pntq8\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.315891 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.316036 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.816017114 +0000 UTC m=+145.954713885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.316139 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.316421 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.816415015 +0000 UTC m=+145.955111786 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.338743 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.431142 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.432091 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:45.932067063 +0000 UTC m=+146.070763834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.458099 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5twfs" event={"ID":"017917e8-8480-473b-858b-46626ef5f770","Type":"ContainerStarted","Data":"c3ef8f1d6bfd8f07d6e056ed0310bd71aea1285437f31aa9214bf5a90888ee48"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.479296 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" event={"ID":"dbd57385-607f-4585-853f-f7ab3b4dd18d","Type":"ContainerStarted","Data":"ec20effaf3622c762f86d70587401a0fcd47e8187f11ead3ecef2abcfffe3a74"} Dec 01 00:09:45 
crc kubenswrapper[4911]: I1201 00:09:45.481948 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" event={"ID":"0be5141c-ebec-4e7b-9e3e-b5c75a16462d","Type":"ContainerStarted","Data":"121ee754f7ad8d9dd67fbe564873b75b59aa55c4f183503d6f069b37f6df8906"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.519560 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:45 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:45 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:45 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.519613 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.538430 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.540313 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:46.040300619 +0000 UTC m=+146.178997390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.576528 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.580138 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5twfs" podStartSLOduration=124.580114649 podStartE2EDuration="2m4.580114649s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:45.542236881 +0000 UTC m=+145.680933652" watchObservedRunningTime="2025-12-01 00:09:45.580114649 +0000 UTC m=+145.718811420" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.609349 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" event={"ID":"2722c6c0-aedc-4764-8e11-7a7b5b694151","Type":"ContainerStarted","Data":"deb6aabe6725083b18ce80c35212e51fd36c28a7ae66123e07963b8f85f381f8"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.633229 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" event={"ID":"132c1139-f0a6-4ca4-97db-bc89244c6b26","Type":"ContainerStarted","Data":"2194fff2a49a203cd5884cba292878d29120f560c737cdcd2f3fbf2e41d0dacc"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.635984 4911 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lslfw" podStartSLOduration=124.635963344 podStartE2EDuration="2m4.635963344s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:45.614974345 +0000 UTC m=+145.753671336" watchObservedRunningTime="2025-12-01 00:09:45.635963344 +0000 UTC m=+145.774660115" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.640813 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.641208 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:46.141192236 +0000 UTC m=+146.279889007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.651175 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pjc2h" podStartSLOduration=123.651159746 podStartE2EDuration="2m3.651159746s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:45.650337104 +0000 UTC m=+145.789033875" watchObservedRunningTime="2025-12-01 00:09:45.651159746 +0000 UTC m=+145.789856517" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.691984 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" event={"ID":"b9ca8e8d-bfc8-4a18-b6a2-3d895d7de5f6","Type":"ContainerStarted","Data":"b883716d5b4187252bdc54e2a37739bc884aaf4eb337cf2dfc9e522c09f1b05b"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.703052 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" event={"ID":"85eaed94-1314-4f16-bdf1-a598b183d97c","Type":"ContainerStarted","Data":"dfd45c213e8bef5e2d17c16d2a055328d9585e7bfe9ae8285e3685b04f291f7d"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.705179 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.708687 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" event={"ID":"269e7d1c-4918-4344-8427-58a6c9c1d8e7","Type":"ContainerStarted","Data":"221f7f8a8f862d85b579c7a0f00181890d312b723662287a64871a73825338ab"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.712937 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.713064 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7mgdz" podStartSLOduration=123.713042495 podStartE2EDuration="2m3.713042495s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:45.67378156 +0000 UTC m=+145.812478331" watchObservedRunningTime="2025-12-01 00:09:45.713042495 +0000 UTC m=+145.851739266" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.715221 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-m2dh9" podStartSLOduration=123.715215304 podStartE2EDuration="2m3.715215304s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:45.713564009 +0000 UTC m=+145.852260780" watchObservedRunningTime="2025-12-01 00:09:45.715215304 +0000 UTC m=+145.853912075" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.718186 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-96ztf" 
event={"ID":"2f47c16a-94f6-48e8-8757-ffea1b773ec8","Type":"ContainerStarted","Data":"6f309843473ce216a74e073447f0f6c1a6ff94fa221ccf82f49c5ae8be6401d9"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.719128 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.754108 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82mw2" event={"ID":"b88051a1-6f40-46c1-b01e-78de96d4a909","Type":"ContainerStarted","Data":"f75ce412bb90a9de0021bb08e1e7622ab9b03b8a4f1a3eb819e447f0d3327e86"} Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.757242 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.757293 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.764798 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.767884 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:46.267866022 +0000 UTC m=+146.406562793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.818128 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4czwh" podStartSLOduration=124.818094675 podStartE2EDuration="2m4.818094675s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:45.772780106 +0000 UTC m=+145.911476887" watchObservedRunningTime="2025-12-01 00:09:45.818094675 +0000 UTC m=+145.956791446" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.821440 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" podStartSLOduration=124.821432256 podStartE2EDuration="2m4.821432256s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:45.797658851 +0000 UTC m=+145.936355622" watchObservedRunningTime="2025-12-01 00:09:45.821432256 +0000 UTC m=+145.960129027" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.822334 4911 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-96ztf" podStartSLOduration=124.82232844 podStartE2EDuration="2m4.82232844s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:45.811372743 +0000 UTC m=+145.950069514" watchObservedRunningTime="2025-12-01 00:09:45.82232844 +0000 UTC m=+145.961025211" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.837077 4911 patch_prober.go:28] interesting pod/console-operator-58897d9998-96ztf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.848160 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-96ztf" podUID="2f47c16a-94f6-48e8-8757-ffea1b773ec8" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.838447 4911 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9qfhm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.33:6443/healthz\": dial tcp 10.217.0.33:6443: connect: connection refused" start-of-body= Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.848753 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" podUID="85eaed94-1314-4f16-bdf1-a598b183d97c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.33:6443/healthz\": dial tcp 10.217.0.33:6443: connect: connection 
refused" Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.938335 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.945624 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:46.445590424 +0000 UTC m=+146.584287195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.954791 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:45 crc kubenswrapper[4911]: E1201 00:09:45.955315 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 00:09:46.455302097 +0000 UTC m=+146.593998868 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:45 crc kubenswrapper[4911]: I1201 00:09:45.957736 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.055572 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:46 crc kubenswrapper[4911]: E1201 00:09:46.056373 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:46.556347979 +0000 UTC m=+146.695044750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.056436 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:46 crc kubenswrapper[4911]: E1201 00:09:46.056978 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:46.55677892 +0000 UTC m=+146.695475691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.426328 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:46 crc kubenswrapper[4911]: E1201 00:09:46.427444 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:46.927425235 +0000 UTC m=+147.066122006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.468893 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:46 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:46 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:46 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.468945 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.528227 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:46 crc kubenswrapper[4911]: E1201 00:09:46.528596 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 00:09:47.028582119 +0000 UTC m=+147.167278890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.552827 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5drp7"] Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.560516 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drp7"] Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.560582 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7k7j2"] Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.560711 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.577689 4911 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.578082 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.694342 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.694691 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-utilities\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.694713 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-catalog-content\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.694753 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wjbh\" (UniqueName: 
\"kubernetes.io/projected/a55ff1f6-20d7-435a-9764-59a0b24f7000-kube-api-access-9wjbh\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: E1201 00:09:46.694837 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:47.194823509 +0000 UTC m=+147.333520280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.727553 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9xb9"] Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.811414 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-utilities\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.811481 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-catalog-content\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " 
pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.811531 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wjbh\" (UniqueName: \"kubernetes.io/projected/a55ff1f6-20d7-435a-9764-59a0b24f7000-kube-api-access-9wjbh\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.811574 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:46 crc kubenswrapper[4911]: E1201 00:09:46.811878 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:47.311867654 +0000 UTC m=+147.450564425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.813040 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-utilities\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.813267 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-catalog-content\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.825003 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" event={"ID":"afd71e8f-dc8d-4906-9c94-e49ba1738d00","Type":"ContainerStarted","Data":"59bd62f28a2459220f4c7f5c6d94d3cc5834ba37a6d3ce977509a3784af0ebda"} Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.849119 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" event={"ID":"319f8b92-7275-4984-aaf1-4b04d926062f","Type":"ContainerStarted","Data":"a88145434b1b036ee52e8a5e8d127354d796e4f789fcdef31b387320ed840d42"} Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.851183 4911 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/dns-default-4v62n" event={"ID":"197d08f0-6e72-49e1-8e05-bc571808c8d3","Type":"ContainerStarted","Data":"092bfad353054e7c7a157034beb3efa926c94a5025ddc9b76c6a927298daa970"} Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.851554 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.852579 4911 generic.go:334] "Generic (PLEG): container finished" podID="28d6f2b1-3fb7-4b11-8e40-7048f385cc5c" containerID="d8d9cceb7a0850f5d3ade80c846b99cc5ade0b0f32661724112939e8c56e703b" exitCode=0 Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.852624 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" event={"ID":"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c","Type":"ContainerDied","Data":"d8d9cceb7a0850f5d3ade80c846b99cc5ade0b0f32661724112939e8c56e703b"} Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.854608 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" event={"ID":"65c9bf23-09ab-404c-acbd-21fb12f9f441","Type":"ContainerStarted","Data":"36a5daafa5106af7a9e7f78de17cc07ffbc29714f804f021d001716244cd32f8"} Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.859048 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.864795 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pntq8"] Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.866092 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-txvnj"] Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.867129 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.871125 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txvnj"] Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.899518 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wjbh\" (UniqueName: \"kubernetes.io/projected/a55ff1f6-20d7-435a-9764-59a0b24f7000-kube-api-access-9wjbh\") pod \"redhat-marketplace-5drp7\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.919176 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-j8fnx" podStartSLOduration=125.919162665 podStartE2EDuration="2m5.919162665s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:46.916434061 +0000 UTC m=+147.055130832" watchObservedRunningTime="2025-12-01 00:09:46.919162665 +0000 UTC m=+147.057859436" Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.920194 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.921748 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" event={"ID":"3dd65335-e065-4572-9e3d-912fe012056b","Type":"ContainerStarted","Data":"27aebb5f490bef73029556fb30a80f363cc8f09346152087a96aee6d51687f4d"} Dec 01 00:09:46 crc 
kubenswrapper[4911]: E1201 00:09:46.934050 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:47.434015558 +0000 UTC m=+147.572712329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.936898 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82mw2" event={"ID":"b88051a1-6f40-46c1-b01e-78de96d4a909","Type":"ContainerStarted","Data":"0429219637db5cf573bedbc8caf04bb489140d94e2f266a0e65600aef8d94aff"} Dec 01 00:09:46 crc kubenswrapper[4911]: I1201 00:09:46.966054 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" event={"ID":"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e","Type":"ContainerStarted","Data":"c00b72e26e71e69d129c040e769ca716c24ce736514074090ce9cb615dee6efe"} Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.059310 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgbkc\" (UniqueName: \"kubernetes.io/projected/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-kube-api-access-sgbkc\") pod \"redhat-marketplace-txvnj\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.059377 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-utilities\") pod \"redhat-marketplace-txvnj\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.059414 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.059579 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-catalog-content\") pod \"redhat-marketplace-txvnj\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: E1201 00:09:47.060881 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:47.560861679 +0000 UTC m=+147.699558450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.090959 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" event={"ID":"fd36e9a2-62fa-4a02-80cd-bbbd26917da5","Type":"ContainerStarted","Data":"e7ace7b358a8762346abc6f8e7c0ebd9831d6fe13f2dad4115132bda33cb2b34"} Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.096518 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gsckt" podStartSLOduration=126.096498126 podStartE2EDuration="2m6.096498126s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:47.095744955 +0000 UTC m=+147.234441716" watchObservedRunningTime="2025-12-01 00:09:47.096498126 +0000 UTC m=+147.235194897" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.096720 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4v62n" podStartSLOduration=12.096715212 podStartE2EDuration="12.096715212s" podCreationTimestamp="2025-12-01 00:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:46.997699636 +0000 UTC m=+147.136396407" watchObservedRunningTime="2025-12-01 00:09:47.096715212 +0000 UTC m=+147.235411983" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 
00:09:47.145613 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" event={"ID":"f4878bb4-63b2-481b-8055-dc5d69809b39","Type":"ContainerStarted","Data":"5eceef6d0d35d6a520be85cef6630331695e793c9266a57823730b5b95faee32"} Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.147788 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" podStartSLOduration=125.147771807 podStartE2EDuration="2m5.147771807s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:47.120059295 +0000 UTC m=+147.258756056" watchObservedRunningTime="2025-12-01 00:09:47.147771807 +0000 UTC m=+147.286468578" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.168448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:47 crc kubenswrapper[4911]: E1201 00:09:47.168754 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:47.668714015 +0000 UTC m=+147.807410786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.168818 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgbkc\" (UniqueName: \"kubernetes.io/projected/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-kube-api-access-sgbkc\") pod \"redhat-marketplace-txvnj\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.168932 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-utilities\") pod \"redhat-marketplace-txvnj\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.169003 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.169205 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-catalog-content\") pod \"redhat-marketplace-txvnj\" (UID: 
\"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.169715 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-catalog-content\") pod \"redhat-marketplace-txvnj\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.170401 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-utilities\") pod \"redhat-marketplace-txvnj\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: E1201 00:09:47.170751 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:47.67074297 +0000 UTC m=+147.809439741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.218112 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.238692 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgbkc\" (UniqueName: \"kubernetes.io/projected/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-kube-api-access-sgbkc\") pod \"redhat-marketplace-txvnj\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.270577 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:47 crc kubenswrapper[4911]: E1201 00:09:47.271485 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 00:09:47.771453202 +0000 UTC m=+147.910149973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.276584 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" event={"ID":"269e7d1c-4918-4344-8427-58a6c9c1d8e7","Type":"ContainerStarted","Data":"71c38b9b7e10249926774be3a43943ad096ecad6962e9b098dd1cfb1765e6db7"} Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.287492 4911 patch_prober.go:28] interesting pod/console-operator-58897d9998-96ztf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.287570 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-96ztf" podUID="2f47c16a-94f6-48e8-8757-ffea1b773ec8" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.291570 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4j8rl" podStartSLOduration=125.291548687 podStartE2EDuration="2m5.291548687s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 00:09:47.252849257 +0000 UTC m=+147.391546028" watchObservedRunningTime="2025-12-01 00:09:47.291548687 +0000 UTC m=+147.430245448" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.304435 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" podStartSLOduration=126.304411826 podStartE2EDuration="2m6.304411826s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:47.293266924 +0000 UTC m=+147.431963695" watchObservedRunningTime="2025-12-01 00:09:47.304411826 +0000 UTC m=+147.443108597" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.315776 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.340710 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-788ks" podStartSLOduration=126.340689891 podStartE2EDuration="2m6.340689891s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:47.334869733 +0000 UTC m=+147.473566494" watchObservedRunningTime="2025-12-01 00:09:47.340689891 +0000 UTC m=+147.479386662" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.341969 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8n99g"] Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.403221 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.407652 4911 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T00:09:46.577725632Z","Handler":null,"Name":""} Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.409383 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:47 crc kubenswrapper[4911]: E1201 00:09:47.409873 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 00:09:47.909858847 +0000 UTC m=+148.048555618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9hgb" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.452549 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jrw9t" podStartSLOduration=126.45252691499999 podStartE2EDuration="2m6.452526915s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:47.450407067 +0000 UTC m=+147.589103838" watchObservedRunningTime="2025-12-01 00:09:47.452526915 +0000 UTC m=+147.591223686" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.475999 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:47 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:47 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:47 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.476085 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.513388 4911 csi_plugin.go:100] 
kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.513539 4911 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.531129 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.614924 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.678200 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.692661 4911 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 00:09:47 crc kubenswrapper[4911]: I1201 00:09:47.692717 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.127305 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ffmzr"] Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.128530 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.145437 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.185855 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9hgb\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.200013 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.212304 4911 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-cm29t"] Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.215502 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.269771 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cm29t"] Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.313765 4911 generic.go:334] "Generic (PLEG): container finished" podID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerID="439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa" exitCode=0 Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.313835 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pntq8" event={"ID":"fe55751b-c29f-4c22-a636-56c9e3232fdf","Type":"ContainerDied","Data":"439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa"} Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.313866 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pntq8" event={"ID":"fe55751b-c29f-4c22-a636-56c9e3232fdf","Type":"ContainerStarted","Data":"fe5468e76745cdd5ea04f238857188277e7b99f5f29cc186357ae796fac93fda"} Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.320688 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.321937 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ffmzr"] Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.323719 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9xb9" event={"ID":"9a33691a-6c8a-47ac-9d8a-cce2a68425e7","Type":"ContainerStarted","Data":"be9bb614e9fc64cd82cc3ae81843e2fa9a31a47a5d99383321241d6e7c819914"} Dec 01 00:09:48 crc 
kubenswrapper[4911]: I1201 00:09:48.335531 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-catalog-content\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.335577 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-utilities\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.335645 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjdvz\" (UniqueName: \"kubernetes.io/projected/a2963529-140d-4683-bc00-448f1aa1ed51-kube-api-access-hjdvz\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.335675 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-catalog-content\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.335707 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmmvs\" (UniqueName: \"kubernetes.io/projected/ed713643-05a3-45af-a821-b053054528dd-kube-api-access-gmmvs\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " 
pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.335740 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-utilities\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.338166 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" event={"ID":"28d6f2b1-3fb7-4b11-8e40-7048f385cc5c","Type":"ContainerStarted","Data":"b3d283d262ec27becec78dc59dc733580689cce180bac5b095bbc7451ab278b9"} Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.354865 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8n99g" event={"ID":"b0ebaddd-1c4a-44c5-ab95-54a174396b80","Type":"ContainerStarted","Data":"b4892b0e3d1354bc6307ab036afc8d5481e5dc2905b18f24dcb1d1903b217a22"} Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.392647 4911 generic.go:334] "Generic (PLEG): container finished" podID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerID="8656b9465f538d0005434d71581f070b621473bfa8e12ec2194d58ed58ad433f" exitCode=0 Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.392711 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k7j2" event={"ID":"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209","Type":"ContainerDied","Data":"8656b9465f538d0005434d71581f070b621473bfa8e12ec2194d58ed58ad433f"} Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.392735 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k7j2" 
event={"ID":"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209","Type":"ContainerStarted","Data":"85fc4ccccdb0542b067e66b77ca56a620eb7363260e1029201f8ce60a6616d03"} Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.398249 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82mw2" event={"ID":"b88051a1-6f40-46c1-b01e-78de96d4a909","Type":"ContainerStarted","Data":"d8c826b650687ab01e33f3cce3c8753f28c705fcf46efa2be52e47d8e66cd859"} Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.419956 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" event={"ID":"3b4fbe7d-7033-41fc-8304-dd64d5a6f34e","Type":"ContainerStarted","Data":"1557eca6f2fd723a82fa7ce529bd78a55b5358c42b9c085e9b513f9317e013b3"} Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.427499 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-96ztf" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.439910 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmmvs\" (UniqueName: \"kubernetes.io/projected/ed713643-05a3-45af-a821-b053054528dd-kube-api-access-gmmvs\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.440000 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-utilities\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.440042 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-catalog-content\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.440074 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-utilities\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.440132 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.440160 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.440188 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjdvz\" (UniqueName: \"kubernetes.io/projected/a2963529-140d-4683-bc00-448f1aa1ed51-kube-api-access-hjdvz\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.440245 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-catalog-content\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.448931 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-catalog-content\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.450791 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-utilities\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.457708 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-utilities\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.459113 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drp7"] Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.459657 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.469538 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.469848 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-catalog-content\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.472521 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.476033 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:48 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:48 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:48 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.476080 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.493775 4911 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.494057 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.495285 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmmvs\" (UniqueName: \"kubernetes.io/projected/ed713643-05a3-45af-a821-b053054528dd-kube-api-access-gmmvs\") pod \"redhat-operators-ffmzr\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.498149 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjdvz\" (UniqueName: \"kubernetes.io/projected/a2963529-140d-4683-bc00-448f1aa1ed51-kube-api-access-hjdvz\") pod \"redhat-operators-cm29t\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.511058 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.521697 4911 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-2v4g2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.521760 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" podUID="28d6f2b1-3fb7-4b11-8e40-7048f385cc5c" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.542415 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.542507 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.572228 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.578853 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.582384 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" podStartSLOduration=126.581208564 podStartE2EDuration="2m6.581208564s" podCreationTimestamp="2025-12-01 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:48.564598264 +0000 UTC m=+148.703295045" watchObservedRunningTime="2025-12-01 00:09:48.581208564 +0000 UTC m=+148.719905335" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.595409 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.657144 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.677470 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.697136 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.697837 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.697869 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.698112 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.698198 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.710087 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.718258 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" podStartSLOduration=127.718242022 podStartE2EDuration="2m7.718242022s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:48.716561456 +0000 UTC m=+148.855258227" watchObservedRunningTime="2025-12-01 00:09:48.718242022 +0000 UTC m=+148.856938793" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.730524 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txvnj"] Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.740576 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.740877 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.749052 4911 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jfg64 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.31:8443/livez\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.749099 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" podUID="3b4fbe7d-7033-41fc-8304-dd64d5a6f34e" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.31:8443/livez\": dial tcp 10.217.0.31:8443: connect: connection refused" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.757317 4911 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.757347 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.772948 4911 patch_prober.go:28] interesting pod/console-f9d7485db-5twfs container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.42:8443/health\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.772996 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5twfs" podUID="017917e8-8480-473b-858b-46626ef5f770" containerName="console" probeResult="failure" output="Get \"https://10.217.0.42:8443/health\": dial tcp 10.217.0.42:8443: connect: connection refused" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.867224 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.868611 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.875448 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.875670 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.952400 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.968203 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7577501b-be27-4bc0-93fd-2858e7d770bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7577501b-be27-4bc0-93fd-2858e7d770bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:48 crc kubenswrapper[4911]: I1201 00:09:48.968263 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7577501b-be27-4bc0-93fd-2858e7d770bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7577501b-be27-4bc0-93fd-2858e7d770bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.069820 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7577501b-be27-4bc0-93fd-2858e7d770bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7577501b-be27-4bc0-93fd-2858e7d770bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.069889 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7577501b-be27-4bc0-93fd-2858e7d770bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7577501b-be27-4bc0-93fd-2858e7d770bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.069966 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7577501b-be27-4bc0-93fd-2858e7d770bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7577501b-be27-4bc0-93fd-2858e7d770bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.299797 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7577501b-be27-4bc0-93fd-2858e7d770bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7577501b-be27-4bc0-93fd-2858e7d770bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.347193 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.426094 4911 generic.go:334] "Generic (PLEG): container finished" podID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerID="aebfb7e3291fb2fe1d4d6f6430f106a12796b89337cd97706899ac7445bd2253" exitCode=0 Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.427867 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9xb9" event={"ID":"9a33691a-6c8a-47ac-9d8a-cce2a68425e7","Type":"ContainerDied","Data":"aebfb7e3291fb2fe1d4d6f6430f106a12796b89337cd97706899ac7445bd2253"} Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.428893 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drp7" event={"ID":"a55ff1f6-20d7-435a-9764-59a0b24f7000","Type":"ContainerStarted","Data":"4e16867a0d18af888a66e57c5f8e67ab50526e028c9d5a3e32df6c4b83f740d0"} Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.438783 4911 generic.go:334] "Generic (PLEG): container finished" podID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerID="45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad" exitCode=0 Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.438960 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8n99g" event={"ID":"b0ebaddd-1c4a-44c5-ab95-54a174396b80","Type":"ContainerDied","Data":"45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad"} Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.440919 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txvnj" event={"ID":"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a","Type":"ContainerStarted","Data":"b50dafb721de51cc1ddd0818365b29bbd3a73b6d5211cd020f34fd4f27752282"} Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.499905 4911 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9hgb"] Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.633982 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:49 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:49 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:49 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:49 crc kubenswrapper[4911]: I1201 00:09:49.634037 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.393439 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cm29t"] Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.542610 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ffmzr"] Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.559426 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:50 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:50 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:50 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.559523 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.610354 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82mw2" event={"ID":"b88051a1-6f40-46c1-b01e-78de96d4a909","Type":"ContainerStarted","Data":"f5f68847c6c3f36105c4c9061f41af687a076e19084c83a54be339656ad4553b"} Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.628296 4911 generic.go:334] "Generic (PLEG): container finished" podID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerID="8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493" exitCode=0 Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.628363 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txvnj" event={"ID":"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a","Type":"ContainerDied","Data":"8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493"} Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.633656 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" event={"ID":"b2bca0c5-b712-4648-a9a8-34543b89d5db","Type":"ContainerStarted","Data":"da54cf183ded94c80f50e62fcbde1af8279e3335f2f9568b845c7ba0406241f7"} Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.653423 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-82mw2" podStartSLOduration=15.65339812 podStartE2EDuration="15.65339812s" podCreationTimestamp="2025-12-01 00:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:50.652257399 +0000 UTC m=+150.790954180" watchObservedRunningTime="2025-12-01 00:09:50.65339812 +0000 UTC m=+150.792094911" Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.661722 4911 generic.go:334] "Generic (PLEG): 
container finished" podID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerID="d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b" exitCode=0 Dec 01 00:09:50 crc kubenswrapper[4911]: I1201 00:09:50.662920 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drp7" event={"ID":"a55ff1f6-20d7-435a-9764-59a0b24f7000","Type":"ContainerDied","Data":"d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b"} Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.140926 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 00:09:51 crc kubenswrapper[4911]: W1201 00:09:51.182064 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2963529_140d_4683_bc00_448f1aa1ed51.slice/crio-d95a39fcda25084eb52a52185f28b46f93ed5ca54ad78a5b6fb4c19d43679c05 WatchSource:0}: Error finding container d95a39fcda25084eb52a52185f28b46f93ed5ca54ad78a5b6fb4c19d43679c05: Status 404 returned error can't find the container with id d95a39fcda25084eb52a52185f28b46f93ed5ca54ad78a5b6fb4c19d43679c05 Dec 01 00:09:51 crc kubenswrapper[4911]: W1201 00:09:51.188993 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded713643_05a3_45af_a821_b053054528dd.slice/crio-909e66524f4aa7cf79141068f5d404fb0703c6164688c88358f7cb389acb2ac2 WatchSource:0}: Error finding container 909e66524f4aa7cf79141068f5d404fb0703c6164688c88358f7cb389acb2ac2: Status 404 returned error can't find the container with id 909e66524f4aa7cf79141068f5d404fb0703c6164688c88358f7cb389acb2ac2 Dec 01 00:09:51 crc kubenswrapper[4911]: W1201 00:09:51.190386 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-260df2f21b83dc3b5e85230b3b5548fb3dc6aa1c8f9e7b05f07c7ae5dc2d1df8 WatchSource:0}: Error finding container 260df2f21b83dc3b5e85230b3b5548fb3dc6aa1c8f9e7b05f07c7ae5dc2d1df8: Status 404 returned error can't find the container with id 260df2f21b83dc3b5e85230b3b5548fb3dc6aa1c8f9e7b05f07c7ae5dc2d1df8 Dec 01 00:09:51 crc kubenswrapper[4911]: W1201 00:09:51.202395 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-138990213d217a7860ecc6619992494c37c43a3f5963f1b683d21b69f8cc39a1 WatchSource:0}: Error finding container 138990213d217a7860ecc6619992494c37c43a3f5963f1b683d21b69f8cc39a1: Status 404 returned error can't find the container with id 138990213d217a7860ecc6619992494c37c43a3f5963f1b683d21b69f8cc39a1 Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.312421 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.312495 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.477127 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 01 00:09:51 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:51 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:51 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.477536 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.668392 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7577501b-be27-4bc0-93fd-2858e7d770bc","Type":"ContainerStarted","Data":"137b7058b0e6f3d3c9d48bf4085e600bc55350a4239a98a9cc20bdcb3a060dcb"} Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.672206 4911 generic.go:334] "Generic (PLEG): container finished" podID="3dd65335-e065-4572-9e3d-912fe012056b" containerID="27aebb5f490bef73029556fb30a80f363cc8f09346152087a96aee6d51687f4d" exitCode=0 Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.672252 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" event={"ID":"3dd65335-e065-4572-9e3d-912fe012056b","Type":"ContainerDied","Data":"27aebb5f490bef73029556fb30a80f363cc8f09346152087a96aee6d51687f4d"} Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.674390 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffmzr" event={"ID":"ed713643-05a3-45af-a821-b053054528dd","Type":"ContainerStarted","Data":"909e66524f4aa7cf79141068f5d404fb0703c6164688c88358f7cb389acb2ac2"} Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.675478 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cm29t" 
event={"ID":"a2963529-140d-4683-bc00-448f1aa1ed51","Type":"ContainerStarted","Data":"d95a39fcda25084eb52a52185f28b46f93ed5ca54ad78a5b6fb4c19d43679c05"} Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.678776 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"138990213d217a7860ecc6619992494c37c43a3f5963f1b683d21b69f8cc39a1"} Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.718916 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" event={"ID":"b2bca0c5-b712-4648-a9a8-34543b89d5db","Type":"ContainerStarted","Data":"bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173"} Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.719723 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.724617 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"369ba6bb7c67ec2a451bd440dcb7b2c59bc69cb7d6f338e0162f4ab5d8a3be86"} Dec 01 00:09:51 crc kubenswrapper[4911]: I1201 00:09:51.799420 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"260df2f21b83dc3b5e85230b3b5548fb3dc6aa1c8f9e7b05f07c7ae5dc2d1df8"} Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.462497 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 01 00:09:52 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:52 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:52 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.462854 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.680095 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" podStartSLOduration=131.680066101 podStartE2EDuration="2m11.680066101s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:51.797021465 +0000 UTC m=+151.935718236" watchObservedRunningTime="2025-12-01 00:09:52.680066101 +0000 UTC m=+152.818762862" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.682489 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.683189 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.685158 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.687050 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.687696 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.774598 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.775089 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.808192 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"37ae7528c20f3cba1a86c94fa18031f17b4cd9a4a91ccd3b839a314b2308980d"} Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.815868 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"83d3585b8d62c58b38cde7255c21fdcf431a71102d2ff697a6e8e4c7788642e9"} Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.832919 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"229e8977ad196e720ab5772b5bd0d6023152ec30b2f397bcf0a245cb65500b92"} Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.833061 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.836643 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7577501b-be27-4bc0-93fd-2858e7d770bc","Type":"ContainerStarted","Data":"a60f7013ecda7e38b0839c4d1b089d13914d00aa85295c0f1118cab4a077fbcc"} Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.876868 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.876927 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.877223 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed713643-05a3-45af-a821-b053054528dd" 
containerID="e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08" exitCode=0 Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.877308 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.877407 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffmzr" event={"ID":"ed713643-05a3-45af-a821-b053054528dd","Type":"ContainerDied","Data":"e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08"} Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.880828 4911 generic.go:334] "Generic (PLEG): container finished" podID="a2963529-140d-4683-bc00-448f1aa1ed51" containerID="1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620" exitCode=0 Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.884331 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cm29t" event={"ID":"a2963529-140d-4683-bc00-448f1aa1ed51","Type":"ContainerDied","Data":"1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620"} Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.898332 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:09:52 crc kubenswrapper[4911]: I1201 00:09:52.921523 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.92149803 
podStartE2EDuration="4.92149803s" podCreationTimestamp="2025-12-01 00:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:09:52.917727178 +0000 UTC m=+153.056423949" watchObservedRunningTime="2025-12-01 00:09:52.92149803 +0000 UTC m=+153.060194801" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.015217 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.470401 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:53 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:53 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:53 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.470940 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.501448 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.513825 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2v4g2" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.791217 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:53 crc kubenswrapper[4911]: 
I1201 00:09:53.805009 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jfg64" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.812206 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.816170 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dd65335-e065-4572-9e3d-912fe012056b-secret-volume\") pod \"3dd65335-e065-4572-9e3d-912fe012056b\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.816286 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dd65335-e065-4572-9e3d-912fe012056b-config-volume\") pod \"3dd65335-e065-4572-9e3d-912fe012056b\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.816345 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzzwb\" (UniqueName: \"kubernetes.io/projected/3dd65335-e065-4572-9e3d-912fe012056b-kube-api-access-pzzwb\") pod \"3dd65335-e065-4572-9e3d-912fe012056b\" (UID: \"3dd65335-e065-4572-9e3d-912fe012056b\") " Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.845287 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd65335-e065-4572-9e3d-912fe012056b-config-volume" (OuterVolumeSpecName: "config-volume") pod "3dd65335-e065-4572-9e3d-912fe012056b" (UID: "3dd65335-e065-4572-9e3d-912fe012056b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.873643 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd65335-e065-4572-9e3d-912fe012056b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3dd65335-e065-4572-9e3d-912fe012056b" (UID: "3dd65335-e065-4572-9e3d-912fe012056b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.874588 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd65335-e065-4572-9e3d-912fe012056b-kube-api-access-pzzwb" (OuterVolumeSpecName: "kube-api-access-pzzwb") pod "3dd65335-e065-4572-9e3d-912fe012056b" (UID: "3dd65335-e065-4572-9e3d-912fe012056b"). InnerVolumeSpecName "kube-api-access-pzzwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.924550 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3dd65335-e065-4572-9e3d-912fe012056b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.924577 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dd65335-e065-4572-9e3d-912fe012056b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:09:53 crc kubenswrapper[4911]: I1201 00:09:53.924588 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzzwb\" (UniqueName: \"kubernetes.io/projected/3dd65335-e065-4572-9e3d-912fe012056b-kube-api-access-pzzwb\") on node \"crc\" DevicePath \"\"" Dec 01 00:09:54 crc kubenswrapper[4911]: I1201 00:09:54.007419 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" 
event={"ID":"3dd65335-e065-4572-9e3d-912fe012056b","Type":"ContainerDied","Data":"57ef82511266f563b3287a8082c344024ee242e67739a29775bd06bb260ae07d"} Dec 01 00:09:54 crc kubenswrapper[4911]: I1201 00:09:54.007481 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57ef82511266f563b3287a8082c344024ee242e67739a29775bd06bb260ae07d" Dec 01 00:09:54 crc kubenswrapper[4911]: I1201 00:09:54.007500 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 00:09:54 crc kubenswrapper[4911]: I1201 00:09:54.009560 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m" Dec 01 00:09:54 crc kubenswrapper[4911]: W1201 00:09:54.082723 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4ac09675_4f70_4a5f_b284_fc32a10f0c9a.slice/crio-b95fbf92a36c7929ed35108fb5e05048414d3036a19190628a025577a8a91ffe WatchSource:0}: Error finding container b95fbf92a36c7929ed35108fb5e05048414d3036a19190628a025577a8a91ffe: Status 404 returned error can't find the container with id b95fbf92a36c7929ed35108fb5e05048414d3036a19190628a025577a8a91ffe Dec 01 00:09:54 crc kubenswrapper[4911]: I1201 00:09:54.436832 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4v62n" Dec 01 00:09:54 crc kubenswrapper[4911]: I1201 00:09:54.478288 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:54 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:54 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:54 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:54 crc kubenswrapper[4911]: 
I1201 00:09:54.478361 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:54 crc kubenswrapper[4911]: I1201 00:09:54.527190 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:09:55 crc kubenswrapper[4911]: I1201 00:09:55.040815 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4ac09675-4f70-4a5f-b284-fc32a10f0c9a","Type":"ContainerStarted","Data":"b95fbf92a36c7929ed35108fb5e05048414d3036a19190628a025577a8a91ffe"} Dec 01 00:09:55 crc kubenswrapper[4911]: I1201 00:09:55.052196 4911 generic.go:334] "Generic (PLEG): container finished" podID="7577501b-be27-4bc0-93fd-2858e7d770bc" containerID="a60f7013ecda7e38b0839c4d1b089d13914d00aa85295c0f1118cab4a077fbcc" exitCode=0 Dec 01 00:09:55 crc kubenswrapper[4911]: I1201 00:09:55.052245 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7577501b-be27-4bc0-93fd-2858e7d770bc","Type":"ContainerDied","Data":"a60f7013ecda7e38b0839c4d1b089d13914d00aa85295c0f1118cab4a077fbcc"} Dec 01 00:09:55 crc kubenswrapper[4911]: I1201 00:09:55.463009 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:55 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:55 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:55 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:55 crc kubenswrapper[4911]: I1201 00:09:55.463682 4911 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.094104 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4ac09675-4f70-4a5f-b284-fc32a10f0c9a","Type":"ContainerStarted","Data":"e451885315a009d3d6a415c63ef16bb64979479e3c853c1d835b04ae01bd6c5e"} Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.460611 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:56 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:56 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:56 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.460658 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.756068 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.877931 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7577501b-be27-4bc0-93fd-2858e7d770bc-kube-api-access\") pod \"7577501b-be27-4bc0-93fd-2858e7d770bc\" (UID: \"7577501b-be27-4bc0-93fd-2858e7d770bc\") " Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.877983 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7577501b-be27-4bc0-93fd-2858e7d770bc-kubelet-dir\") pod \"7577501b-be27-4bc0-93fd-2858e7d770bc\" (UID: \"7577501b-be27-4bc0-93fd-2858e7d770bc\") " Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.878248 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7577501b-be27-4bc0-93fd-2858e7d770bc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7577501b-be27-4bc0-93fd-2858e7d770bc" (UID: "7577501b-be27-4bc0-93fd-2858e7d770bc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.892661 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7577501b-be27-4bc0-93fd-2858e7d770bc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7577501b-be27-4bc0-93fd-2858e7d770bc" (UID: "7577501b-be27-4bc0-93fd-2858e7d770bc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.979392 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7577501b-be27-4bc0-93fd-2858e7d770bc-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 00:09:56 crc kubenswrapper[4911]: I1201 00:09:56.979436 4911 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7577501b-be27-4bc0-93fd-2858e7d770bc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 00:09:57 crc kubenswrapper[4911]: I1201 00:09:57.130113 4911 generic.go:334] "Generic (PLEG): container finished" podID="4ac09675-4f70-4a5f-b284-fc32a10f0c9a" containerID="e451885315a009d3d6a415c63ef16bb64979479e3c853c1d835b04ae01bd6c5e" exitCode=0 Dec 01 00:09:57 crc kubenswrapper[4911]: I1201 00:09:57.130200 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4ac09675-4f70-4a5f-b284-fc32a10f0c9a","Type":"ContainerDied","Data":"e451885315a009d3d6a415c63ef16bb64979479e3c853c1d835b04ae01bd6c5e"} Dec 01 00:09:57 crc kubenswrapper[4911]: I1201 00:09:57.160636 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7577501b-be27-4bc0-93fd-2858e7d770bc","Type":"ContainerDied","Data":"137b7058b0e6f3d3c9d48bf4085e600bc55350a4239a98a9cc20bdcb3a060dcb"} Dec 01 00:09:57 crc kubenswrapper[4911]: I1201 00:09:57.160673 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="137b7058b0e6f3d3c9d48bf4085e600bc55350a4239a98a9cc20bdcb3a060dcb" Dec 01 00:09:57 crc kubenswrapper[4911]: I1201 00:09:57.160750 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 00:09:57 crc kubenswrapper[4911]: I1201 00:09:57.463277 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:57 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:57 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:57 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:57 crc kubenswrapper[4911]: I1201 00:09:57.463322 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:58 crc kubenswrapper[4911]: I1201 00:09:58.464113 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:58 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:58 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:58 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:58 crc kubenswrapper[4911]: I1201 00:09:58.464490 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:09:58 crc kubenswrapper[4911]: I1201 00:09:58.680333 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:09:58 crc kubenswrapper[4911]: I1201 00:09:58.680424 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:09:58 crc kubenswrapper[4911]: I1201 00:09:58.681116 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:09:58 crc kubenswrapper[4911]: I1201 00:09:58.681180 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:09:58 crc kubenswrapper[4911]: I1201 00:09:58.757240 4911 patch_prober.go:28] interesting pod/console-f9d7485db-5twfs container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.42:8443/health\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Dec 01 00:09:58 crc kubenswrapper[4911]: I1201 00:09:58.757295 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5twfs" podUID="017917e8-8480-473b-858b-46626ef5f770" containerName="console" probeResult="failure" output="Get \"https://10.217.0.42:8443/health\": dial tcp 10.217.0.42:8443: connect: connection refused" Dec 01 00:09:59 crc kubenswrapper[4911]: I1201 00:09:59.462502 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:09:59 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:09:59 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:09:59 crc kubenswrapper[4911]: healthz check failed Dec 01 00:09:59 crc kubenswrapper[4911]: I1201 00:09:59.462595 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:10:00 crc kubenswrapper[4911]: I1201 00:10:00.460905 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:10:00 crc kubenswrapper[4911]: [-]has-synced failed: reason withheld Dec 01 00:10:00 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:10:00 crc kubenswrapper[4911]: healthz check failed Dec 01 00:10:00 crc kubenswrapper[4911]: I1201 00:10:00.461414 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:10:01 crc kubenswrapper[4911]: I1201 00:10:01.462045 4911 patch_prober.go:28] interesting pod/router-default-5444994796-l6g55 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 00:10:01 crc kubenswrapper[4911]: [+]has-synced ok Dec 01 00:10:01 crc kubenswrapper[4911]: [+]process-running ok Dec 01 00:10:01 crc kubenswrapper[4911]: 
healthz check failed Dec 01 00:10:01 crc kubenswrapper[4911]: I1201 00:10:01.462129 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l6g55" podUID="3daa9b4f-c005-418c-854c-a81a04ab607a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 00:10:02 crc kubenswrapper[4911]: I1201 00:10:02.463611 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:10:02 crc kubenswrapper[4911]: I1201 00:10:02.466951 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-l6g55" Dec 01 00:10:04 crc kubenswrapper[4911]: I1201 00:10:04.651174 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:10:04 crc kubenswrapper[4911]: I1201 00:10:04.658250 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10941e4a-3eac-4ef3-a814-c83adcea347e-metrics-certs\") pod \"network-metrics-daemon-bzs4g\" (UID: \"10941e4a-3eac-4ef3-a814-c83adcea347e\") " pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:10:04 crc kubenswrapper[4911]: I1201 00:10:04.787423 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzs4g" Dec 01 00:10:07 crc kubenswrapper[4911]: I1201 00:10:07.606623 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:10:07 crc kubenswrapper[4911]: I1201 00:10:07.695866 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kube-api-access\") pod \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\" (UID: \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\") " Dec 01 00:10:07 crc kubenswrapper[4911]: I1201 00:10:07.695956 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kubelet-dir\") pod \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\" (UID: \"4ac09675-4f70-4a5f-b284-fc32a10f0c9a\") " Dec 01 00:10:07 crc kubenswrapper[4911]: I1201 00:10:07.696060 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ac09675-4f70-4a5f-b284-fc32a10f0c9a" (UID: "4ac09675-4f70-4a5f-b284-fc32a10f0c9a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:10:07 crc kubenswrapper[4911]: I1201 00:10:07.696513 4911 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 00:10:07 crc kubenswrapper[4911]: I1201 00:10:07.726555 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ac09675-4f70-4a5f-b284-fc32a10f0c9a" (UID: "4ac09675-4f70-4a5f-b284-fc32a10f0c9a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:10:07 crc kubenswrapper[4911]: I1201 00:10:07.802682 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ac09675-4f70-4a5f-b284-fc32a10f0c9a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.247132 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bzs4g"] Dec 01 00:10:08 crc kubenswrapper[4911]: W1201 00:10:08.252827 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10941e4a_3eac_4ef3_a814_c83adcea347e.slice/crio-0e58e8fd8f04d4c15b2cb1a9481b8473a4d28a3c395fd5885d0e3d4477467bca WatchSource:0}: Error finding container 0e58e8fd8f04d4c15b2cb1a9481b8473a4d28a3c395fd5885d0e3d4477467bca: Status 404 returned error can't find the container with id 0e58e8fd8f04d4c15b2cb1a9481b8473a4d28a3c395fd5885d0e3d4477467bca Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.412541 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" event={"ID":"10941e4a-3eac-4ef3-a814-c83adcea347e","Type":"ContainerStarted","Data":"0e58e8fd8f04d4c15b2cb1a9481b8473a4d28a3c395fd5885d0e3d4477467bca"} Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.414173 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4ac09675-4f70-4a5f-b284-fc32a10f0c9a","Type":"ContainerDied","Data":"b95fbf92a36c7929ed35108fb5e05048414d3036a19190628a025577a8a91ffe"} Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.414203 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b95fbf92a36c7929ed35108fb5e05048414d3036a19190628a025577a8a91ffe" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.414273 4911 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.521740 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.679898 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.679949 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.680354 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.680421 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.680537 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-ckndj" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.681191 4911 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"fc2730a88fba6e28546cf83161b6b0bbb24f524351875f27b930f14b6d016b33"} pod="openshift-console/downloads-7954f5f757-ckndj" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.681322 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" containerID="cri-o://fc2730a88fba6e28546cf83161b6b0bbb24f524351875f27b930f14b6d016b33" gracePeriod=2 Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.681649 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.681753 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.761468 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:10:08 crc kubenswrapper[4911]: I1201 00:10:08.765953 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-5twfs" Dec 01 00:10:09 crc kubenswrapper[4911]: I1201 00:10:09.422277 4911 generic.go:334] "Generic (PLEG): container finished" podID="92365c00-de26-4e68-89d2-724cf199e249" containerID="fc2730a88fba6e28546cf83161b6b0bbb24f524351875f27b930f14b6d016b33" exitCode=0 Dec 01 
00:10:09 crc kubenswrapper[4911]: I1201 00:10:09.422393 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ckndj" event={"ID":"92365c00-de26-4e68-89d2-724cf199e249","Type":"ContainerDied","Data":"fc2730a88fba6e28546cf83161b6b0bbb24f524351875f27b930f14b6d016b33"} Dec 01 00:10:09 crc kubenswrapper[4911]: I1201 00:10:09.424009 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bzs4g" event={"ID":"10941e4a-3eac-4ef3-a814-c83adcea347e","Type":"ContainerStarted","Data":"c3b353c519bebd386ba6a4338eb34a4e431980eea2f6f40c509d6be206468913"} Dec 01 00:10:11 crc kubenswrapper[4911]: I1201 00:10:11.455476 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ckndj" event={"ID":"92365c00-de26-4e68-89d2-724cf199e249","Type":"ContainerStarted","Data":"d18dc9c1978d09fe99e0222327b0e078410f3193dafb810713c99cedaddbab14"} Dec 01 00:10:11 crc kubenswrapper[4911]: I1201 00:10:11.458999 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ckndj" Dec 01 00:10:11 crc kubenswrapper[4911]: I1201 00:10:11.460787 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:11 crc kubenswrapper[4911]: I1201 00:10:11.460973 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:11 crc kubenswrapper[4911]: I1201 00:10:11.464673 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-bzs4g" event={"ID":"10941e4a-3eac-4ef3-a814-c83adcea347e","Type":"ContainerStarted","Data":"e8225f51ed2fba09e11cfb5550c67acba25e539827b6b01f2199a40f242e254f"} Dec 01 00:10:11 crc kubenswrapper[4911]: I1201 00:10:11.510036 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bzs4g" podStartSLOduration=150.51001319 podStartE2EDuration="2m30.51001319s" podCreationTimestamp="2025-12-01 00:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:10:11.50483566 +0000 UTC m=+171.643532481" watchObservedRunningTime="2025-12-01 00:10:11.51001319 +0000 UTC m=+171.648709971" Dec 01 00:10:12 crc kubenswrapper[4911]: I1201 00:10:12.472565 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:12 crc kubenswrapper[4911]: I1201 00:10:12.473169 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:17 crc kubenswrapper[4911]: I1201 00:10:17.506110 4911 generic.go:334] "Generic (PLEG): container finished" podID="1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc" containerID="9a1c1af3596612873fd0d0f3462aa0a6065423485922b2e98dfc8ecccb87aefe" exitCode=0 Dec 01 00:10:17 crc kubenswrapper[4911]: I1201 00:10:17.506512 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29409120-hbwz5" 
event={"ID":"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc","Type":"ContainerDied","Data":"9a1c1af3596612873fd0d0f3462aa0a6065423485922b2e98dfc8ecccb87aefe"} Dec 01 00:10:18 crc kubenswrapper[4911]: I1201 00:10:18.465692 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5s5s4" Dec 01 00:10:18 crc kubenswrapper[4911]: I1201 00:10:18.681047 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:18 crc kubenswrapper[4911]: I1201 00:10:18.681135 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:18 crc kubenswrapper[4911]: I1201 00:10:18.681186 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:18 crc kubenswrapper[4911]: I1201 00:10:18.681131 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:21 crc kubenswrapper[4911]: I1201 00:10:21.311890 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:10:21 crc kubenswrapper[4911]: I1201 00:10:21.312281 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.860697 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 00:10:24 crc kubenswrapper[4911]: E1201 00:10:24.861732 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac09675-4f70-4a5f-b284-fc32a10f0c9a" containerName="pruner" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.861809 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac09675-4f70-4a5f-b284-fc32a10f0c9a" containerName="pruner" Dec 01 00:10:24 crc kubenswrapper[4911]: E1201 00:10:24.861878 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7577501b-be27-4bc0-93fd-2858e7d770bc" containerName="pruner" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.861901 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7577501b-be27-4bc0-93fd-2858e7d770bc" containerName="pruner" Dec 01 00:10:24 crc kubenswrapper[4911]: E1201 00:10:24.861926 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd65335-e065-4572-9e3d-912fe012056b" containerName="collect-profiles" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.861942 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd65335-e065-4572-9e3d-912fe012056b" containerName="collect-profiles" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.862401 4911 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ac09675-4f70-4a5f-b284-fc32a10f0c9a" containerName="pruner" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.862567 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7577501b-be27-4bc0-93fd-2858e7d770bc" containerName="pruner" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.862594 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd65335-e065-4572-9e3d-912fe012056b" containerName="collect-profiles" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.863614 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.866950 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.874977 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.881034 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.924175 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71e24cc0-ba97-4a02-a743-f0946de5df74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71e24cc0-ba97-4a02-a743-f0946de5df74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:24 crc kubenswrapper[4911]: I1201 00:10:24.924311 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71e24cc0-ba97-4a02-a743-f0946de5df74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71e24cc0-ba97-4a02-a743-f0946de5df74\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:25 crc kubenswrapper[4911]: I1201 00:10:25.026495 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71e24cc0-ba97-4a02-a743-f0946de5df74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71e24cc0-ba97-4a02-a743-f0946de5df74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:25 crc kubenswrapper[4911]: I1201 00:10:25.026687 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71e24cc0-ba97-4a02-a743-f0946de5df74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71e24cc0-ba97-4a02-a743-f0946de5df74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:25 crc kubenswrapper[4911]: I1201 00:10:25.026793 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71e24cc0-ba97-4a02-a743-f0946de5df74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71e24cc0-ba97-4a02-a743-f0946de5df74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:25 crc kubenswrapper[4911]: I1201 00:10:25.053422 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71e24cc0-ba97-4a02-a743-f0946de5df74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71e24cc0-ba97-4a02-a743-f0946de5df74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:25 crc kubenswrapper[4911]: I1201 00:10:25.206318 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.620995 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.679613 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.679699 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.679982 4911 patch_prober.go:28] interesting pod/downloads-7954f5f757-ckndj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.680047 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ckndj" podUID="92365c00-de26-4e68-89d2-724cf199e249" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.887322 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.986597 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whrf7\" (UniqueName: \"kubernetes.io/projected/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-kube-api-access-whrf7\") pod \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\" (UID: \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\") " Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.986662 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-serviceca\") pod \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\" (UID: \"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc\") " Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.987810 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-serviceca" (OuterVolumeSpecName: "serviceca") pod "1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc" (UID: "1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:10:28 crc kubenswrapper[4911]: I1201 00:10:28.994630 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-kube-api-access-whrf7" (OuterVolumeSpecName: "kube-api-access-whrf7") pod "1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc" (UID: "1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc"). InnerVolumeSpecName "kube-api-access-whrf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:10:29 crc kubenswrapper[4911]: I1201 00:10:29.088626 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whrf7\" (UniqueName: \"kubernetes.io/projected/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-kube-api-access-whrf7\") on node \"crc\" DevicePath \"\"" Dec 01 00:10:29 crc kubenswrapper[4911]: I1201 00:10:29.088674 4911 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 00:10:29 crc kubenswrapper[4911]: I1201 00:10:29.593915 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29409120-hbwz5" event={"ID":"1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc","Type":"ContainerDied","Data":"adc453bb9ede638da36ad16fcce40682ec7275f7738d1ed52894a4246024caf7"} Dec 01 00:10:29 crc kubenswrapper[4911]: I1201 00:10:29.594275 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adc453bb9ede638da36ad16fcce40682ec7275f7738d1ed52894a4246024caf7" Dec 01 00:10:29 crc kubenswrapper[4911]: I1201 00:10:29.594000 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29409120-hbwz5" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.250878 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 00:10:30 crc kubenswrapper[4911]: E1201 00:10:30.251150 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc" containerName="image-pruner" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.251167 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc" containerName="image-pruner" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.251278 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a62b9e4-12f1-43cc-ac16-ad8cc69a08fc" containerName="image-pruner" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.251788 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.270952 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.305937 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kube-api-access\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.306007 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-var-lock\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc 
kubenswrapper[4911]: I1201 00:10:30.306399 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.407586 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.407661 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.407696 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kube-api-access\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.407740 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-var-lock\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.407823 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-var-lock\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.427420 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kube-api-access\") pod \"installer-9-crc\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:30 crc kubenswrapper[4911]: I1201 00:10:30.585213 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:10:34 crc kubenswrapper[4911]: E1201 00:10:34.077039 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 00:10:34 crc kubenswrapper[4911]: E1201 00:10:34.077662 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9wjbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5drp7_openshift-marketplace(a55ff1f6-20d7-435a-9764-59a0b24f7000): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:10:34 crc kubenswrapper[4911]: E1201 00:10:34.078849 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5drp7" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" Dec 01 00:10:34 crc 
kubenswrapper[4911]: E1201 00:10:34.869697 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 00:10:34 crc kubenswrapper[4911]: E1201 00:10:34.869926 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgbkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-txvnj_openshift-marketplace(5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:10:34 crc kubenswrapper[4911]: E1201 00:10:34.871227 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-txvnj" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" Dec 01 00:10:34 crc kubenswrapper[4911]: E1201 00:10:34.878914 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5drp7" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" Dec 01 00:10:35 crc kubenswrapper[4911]: E1201 00:10:35.435128 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 00:10:35 crc kubenswrapper[4911]: E1201 00:10:35.435684 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6l6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-f9xb9_openshift-marketplace(9a33691a-6c8a-47ac-9d8a-cce2a68425e7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:10:35 crc kubenswrapper[4911]: E1201 00:10:35.436987 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-f9xb9" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" Dec 01 00:10:37 crc 
kubenswrapper[4911]: E1201 00:10:37.955978 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-txvnj" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" Dec 01 00:10:37 crc kubenswrapper[4911]: E1201 00:10:37.956185 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-f9xb9" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" Dec 01 00:10:38 crc kubenswrapper[4911]: I1201 00:10:38.705103 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ckndj" Dec 01 00:10:42 crc kubenswrapper[4911]: E1201 00:10:42.178435 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 00:10:42 crc kubenswrapper[4911]: E1201 00:10:42.179017 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5f2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pntq8_openshift-marketplace(fe55751b-c29f-4c22-a636-56c9e3232fdf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:10:42 crc kubenswrapper[4911]: E1201 00:10:42.180289 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pntq8" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" Dec 01 00:10:44 crc 
kubenswrapper[4911]: E1201 00:10:44.236223 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pntq8" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" Dec 01 00:10:44 crc kubenswrapper[4911]: I1201 00:10:44.694989 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 00:10:44 crc kubenswrapper[4911]: W1201 00:10:44.709004 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod71e24cc0_ba97_4a02_a743_f0946de5df74.slice/crio-afdacfef264533ace6d27aa5750cec13ef706d868966cc9ad30537214ab4964e WatchSource:0}: Error finding container afdacfef264533ace6d27aa5750cec13ef706d868966cc9ad30537214ab4964e: Status 404 returned error can't find the container with id afdacfef264533ace6d27aa5750cec13ef706d868966cc9ad30537214ab4964e Dec 01 00:10:44 crc kubenswrapper[4911]: I1201 00:10:44.804779 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 00:10:45 crc kubenswrapper[4911]: I1201 00:10:45.707078 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d","Type":"ContainerStarted","Data":"5dd01f15084b4e6e042d2f856c8992899be6a5d8f2e86df4c588316c9a2b7e6d"} Dec 01 00:10:45 crc kubenswrapper[4911]: I1201 00:10:45.710051 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71e24cc0-ba97-4a02-a743-f0946de5df74","Type":"ContainerStarted","Data":"afdacfef264533ace6d27aa5750cec13ef706d868966cc9ad30537214ab4964e"} Dec 01 00:10:47 crc kubenswrapper[4911]: I1201 00:10:47.723900 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71e24cc0-ba97-4a02-a743-f0946de5df74","Type":"ContainerStarted","Data":"7da3114655d8c1519c67b8b3241e32bfec976271c6f0e0ae427563ed516d8bf1"} Dec 01 00:10:48 crc kubenswrapper[4911]: I1201 00:10:48.734635 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d","Type":"ContainerStarted","Data":"419f645f23937820d44d410218cef7477e8c0af05ec2470abf1c4cf595b0d226"} Dec 01 00:10:48 crc kubenswrapper[4911]: I1201 00:10:48.768348 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=18.768315647 podStartE2EDuration="18.768315647s" podCreationTimestamp="2025-12-01 00:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:10:48.761296782 +0000 UTC m=+208.899993583" watchObservedRunningTime="2025-12-01 00:10:48.768315647 +0000 UTC m=+208.907012448" Dec 01 00:10:48 crc kubenswrapper[4911]: I1201 00:10:48.783082 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=24.783051676 podStartE2EDuration="24.783051676s" podCreationTimestamp="2025-12-01 00:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:10:48.781414873 +0000 UTC m=+208.920111674" watchObservedRunningTime="2025-12-01 00:10:48.783051676 +0000 UTC m=+208.921748477" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.310170 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 00:10:49 crc 
kubenswrapper[4911]: E1201 00:10:49.310862 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gmmvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ffmzr_openshift-marketplace(ed713643-05a3-45af-a821-b053054528dd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.312185 4911 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ffmzr" podUID="ed713643-05a3-45af-a821-b053054528dd" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.348621 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.348823 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fb88w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptio
ns:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8n99g_openshift-marketplace(b0ebaddd-1c4a-44c5-ab95-54a174396b80): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.350027 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8n99g" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.356687 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.356781 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hjdvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cm29t_openshift-marketplace(a2963529-140d-4683-bc00-448f1aa1ed51): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.358177 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cm29t" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" Dec 01 00:10:49 crc 
kubenswrapper[4911]: E1201 00:10:49.743566 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cm29t" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.743885 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8n99g" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" Dec 01 00:10:49 crc kubenswrapper[4911]: E1201 00:10:49.746350 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ffmzr" podUID="ed713643-05a3-45af-a821-b053054528dd" Dec 01 00:10:50 crc kubenswrapper[4911]: I1201 00:10:50.748878 4911 generic.go:334] "Generic (PLEG): container finished" podID="71e24cc0-ba97-4a02-a743-f0946de5df74" containerID="7da3114655d8c1519c67b8b3241e32bfec976271c6f0e0ae427563ed516d8bf1" exitCode=0 Dec 01 00:10:50 crc kubenswrapper[4911]: I1201 00:10:50.748937 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71e24cc0-ba97-4a02-a743-f0946de5df74","Type":"ContainerDied","Data":"7da3114655d8c1519c67b8b3241e32bfec976271c6f0e0ae427563ed516d8bf1"} Dec 01 00:10:51 crc kubenswrapper[4911]: I1201 00:10:51.312171 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:10:51 crc kubenswrapper[4911]: I1201 00:10:51.312260 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:10:51 crc kubenswrapper[4911]: I1201 00:10:51.312347 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:10:51 crc kubenswrapper[4911]: I1201 00:10:51.313284 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:10:51 crc kubenswrapper[4911]: I1201 00:10:51.313416 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3" gracePeriod=600 Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.062944 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.219410 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71e24cc0-ba97-4a02-a743-f0946de5df74-kubelet-dir\") pod \"71e24cc0-ba97-4a02-a743-f0946de5df74\" (UID: \"71e24cc0-ba97-4a02-a743-f0946de5df74\") " Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.219591 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71e24cc0-ba97-4a02-a743-f0946de5df74-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "71e24cc0-ba97-4a02-a743-f0946de5df74" (UID: "71e24cc0-ba97-4a02-a743-f0946de5df74"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.219955 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71e24cc0-ba97-4a02-a743-f0946de5df74-kube-api-access\") pod \"71e24cc0-ba97-4a02-a743-f0946de5df74\" (UID: \"71e24cc0-ba97-4a02-a743-f0946de5df74\") " Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.220608 4911 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71e24cc0-ba97-4a02-a743-f0946de5df74-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.229067 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71e24cc0-ba97-4a02-a743-f0946de5df74-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "71e24cc0-ba97-4a02-a743-f0946de5df74" (UID: "71e24cc0-ba97-4a02-a743-f0946de5df74"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.321487 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71e24cc0-ba97-4a02-a743-f0946de5df74-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.765070 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71e24cc0-ba97-4a02-a743-f0946de5df74","Type":"ContainerDied","Data":"afdacfef264533ace6d27aa5750cec13ef706d868966cc9ad30537214ab4964e"} Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.765131 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afdacfef264533ace6d27aa5750cec13ef706d868966cc9ad30537214ab4964e" Dec 01 00:10:52 crc kubenswrapper[4911]: I1201 00:10:52.765146 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 00:10:54 crc kubenswrapper[4911]: E1201 00:10:54.408120 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 00:10:54 crc kubenswrapper[4911]: E1201 00:10:54.408726 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h82kq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7k7j2_openshift-marketplace(5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:10:54 crc kubenswrapper[4911]: E1201 00:10:54.409995 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7k7j2" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" Dec 01 00:10:54 crc 
kubenswrapper[4911]: I1201 00:10:54.801182 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3" exitCode=0 Dec 01 00:10:54 crc kubenswrapper[4911]: I1201 00:10:54.801529 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3"} Dec 01 00:10:54 crc kubenswrapper[4911]: E1201 00:10:54.808347 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7k7j2" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" Dec 01 00:10:55 crc kubenswrapper[4911]: I1201 00:10:55.811520 4911 generic.go:334] "Generic (PLEG): container finished" podID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerID="240b25678cb2dcd751580908e3275fdb0e9ac474a36027a15b161df9d288dfc7" exitCode=0 Dec 01 00:10:55 crc kubenswrapper[4911]: I1201 00:10:55.811602 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9xb9" event={"ID":"9a33691a-6c8a-47ac-9d8a-cce2a68425e7","Type":"ContainerDied","Data":"240b25678cb2dcd751580908e3275fdb0e9ac474a36027a15b161df9d288dfc7"} Dec 01 00:10:55 crc kubenswrapper[4911]: I1201 00:10:55.814754 4911 generic.go:334] "Generic (PLEG): container finished" podID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerID="92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98" exitCode=0 Dec 01 00:10:55 crc kubenswrapper[4911]: I1201 00:10:55.814821 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drp7" 
event={"ID":"a55ff1f6-20d7-435a-9764-59a0b24f7000","Type":"ContainerDied","Data":"92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98"} Dec 01 00:10:55 crc kubenswrapper[4911]: I1201 00:10:55.818226 4911 generic.go:334] "Generic (PLEG): container finished" podID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerID="35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a" exitCode=0 Dec 01 00:10:55 crc kubenswrapper[4911]: I1201 00:10:55.818283 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txvnj" event={"ID":"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a","Type":"ContainerDied","Data":"35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a"} Dec 01 00:10:55 crc kubenswrapper[4911]: I1201 00:10:55.821930 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"40d2810c34239bb4bb2db3aad261028e5a8dee231ec9b175a243b041ac383386"} Dec 01 00:10:56 crc kubenswrapper[4911]: I1201 00:10:56.828943 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txvnj" event={"ID":"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a","Type":"ContainerStarted","Data":"0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6"} Dec 01 00:10:56 crc kubenswrapper[4911]: I1201 00:10:56.830640 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9xb9" event={"ID":"9a33691a-6c8a-47ac-9d8a-cce2a68425e7","Type":"ContainerStarted","Data":"1eeb8fc3b6c38a0811635435e6b28e376735a0962606d2fc259a3c285b58a34f"} Dec 01 00:10:56 crc kubenswrapper[4911]: I1201 00:10:56.832475 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drp7" 
event={"ID":"a55ff1f6-20d7-435a-9764-59a0b24f7000","Type":"ContainerStarted","Data":"56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80"} Dec 01 00:10:56 crc kubenswrapper[4911]: I1201 00:10:56.851936 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-txvnj" podStartSLOduration=4.940933289 podStartE2EDuration="1m10.851918336s" podCreationTimestamp="2025-12-01 00:09:46 +0000 UTC" firstStartedPulling="2025-12-01 00:09:50.634941989 +0000 UTC m=+150.773638760" lastFinishedPulling="2025-12-01 00:10:56.545926996 +0000 UTC m=+216.684623807" observedRunningTime="2025-12-01 00:10:56.848995219 +0000 UTC m=+216.987692020" watchObservedRunningTime="2025-12-01 00:10:56.851918336 +0000 UTC m=+216.990615107" Dec 01 00:10:56 crc kubenswrapper[4911]: I1201 00:10:56.879039 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5drp7" podStartSLOduration=5.270711491 podStartE2EDuration="1m10.87901334s" podCreationTimestamp="2025-12-01 00:09:46 +0000 UTC" firstStartedPulling="2025-12-01 00:09:50.669897567 +0000 UTC m=+150.808594338" lastFinishedPulling="2025-12-01 00:10:56.278199406 +0000 UTC m=+216.416896187" observedRunningTime="2025-12-01 00:10:56.877403198 +0000 UTC m=+217.016099999" watchObservedRunningTime="2025-12-01 00:10:56.87901334 +0000 UTC m=+217.017710111" Dec 01 00:10:56 crc kubenswrapper[4911]: I1201 00:10:56.898735 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f9xb9" podStartSLOduration=6.070149737 podStartE2EDuration="1m12.89871236s" podCreationTimestamp="2025-12-01 00:09:44 +0000 UTC" firstStartedPulling="2025-12-01 00:09:49.487941073 +0000 UTC m=+149.626637844" lastFinishedPulling="2025-12-01 00:10:56.316503696 +0000 UTC m=+216.455200467" observedRunningTime="2025-12-01 00:10:56.896365488 +0000 UTC m=+217.035062269" 
watchObservedRunningTime="2025-12-01 00:10:56.89871236 +0000 UTC m=+217.037409131" Dec 01 00:10:57 crc kubenswrapper[4911]: I1201 00:10:57.219069 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:10:57 crc kubenswrapper[4911]: I1201 00:10:57.219127 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:10:57 crc kubenswrapper[4911]: I1201 00:10:57.404882 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:10:57 crc kubenswrapper[4911]: I1201 00:10:57.404950 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:10:58 crc kubenswrapper[4911]: I1201 00:10:58.278490 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5drp7" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="registry-server" probeResult="failure" output=< Dec 01 00:10:58 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Dec 01 00:10:58 crc kubenswrapper[4911]: > Dec 01 00:10:58 crc kubenswrapper[4911]: I1201 00:10:58.449873 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-txvnj" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="registry-server" probeResult="failure" output=< Dec 01 00:10:58 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Dec 01 00:10:58 crc kubenswrapper[4911]: > Dec 01 00:10:59 crc kubenswrapper[4911]: I1201 00:10:59.859170 4911 generic.go:334] "Generic (PLEG): container finished" podID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerID="ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6" exitCode=0 Dec 01 00:10:59 crc kubenswrapper[4911]: I1201 00:10:59.859942 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pntq8" event={"ID":"fe55751b-c29f-4c22-a636-56c9e3232fdf","Type":"ContainerDied","Data":"ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6"} Dec 01 00:11:01 crc kubenswrapper[4911]: I1201 00:11:01.888742 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pntq8" event={"ID":"fe55751b-c29f-4c22-a636-56c9e3232fdf","Type":"ContainerStarted","Data":"4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148"} Dec 01 00:11:01 crc kubenswrapper[4911]: I1201 00:11:01.914159 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pntq8" podStartSLOduration=5.1331017899999996 podStartE2EDuration="1m17.914135153s" podCreationTimestamp="2025-12-01 00:09:44 +0000 UTC" firstStartedPulling="2025-12-01 00:09:48.320399879 +0000 UTC m=+148.459096650" lastFinishedPulling="2025-12-01 00:11:01.101433242 +0000 UTC m=+221.240130013" observedRunningTime="2025-12-01 00:11:01.912889891 +0000 UTC m=+222.051586682" watchObservedRunningTime="2025-12-01 00:11:01.914135153 +0000 UTC m=+222.052831934" Dec 01 00:11:02 crc kubenswrapper[4911]: I1201 00:11:02.899613 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed713643-05a3-45af-a821-b053054528dd" containerID="0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7" exitCode=0 Dec 01 00:11:02 crc kubenswrapper[4911]: I1201 00:11:02.899660 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffmzr" event={"ID":"ed713643-05a3-45af-a821-b053054528dd","Type":"ContainerDied","Data":"0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7"} Dec 01 00:11:03 crc kubenswrapper[4911]: I1201 00:11:03.919472 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffmzr" 
event={"ID":"ed713643-05a3-45af-a821-b053054528dd","Type":"ContainerStarted","Data":"c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c"} Dec 01 00:11:03 crc kubenswrapper[4911]: I1201 00:11:03.943235 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ffmzr" podStartSLOduration=6.393762537 podStartE2EDuration="1m16.943216754s" podCreationTimestamp="2025-12-01 00:09:47 +0000 UTC" firstStartedPulling="2025-12-01 00:09:52.911713915 +0000 UTC m=+153.050410686" lastFinishedPulling="2025-12-01 00:11:03.461168132 +0000 UTC m=+223.599864903" observedRunningTime="2025-12-01 00:11:03.941218641 +0000 UTC m=+224.079915422" watchObservedRunningTime="2025-12-01 00:11:03.943216754 +0000 UTC m=+224.081913525" Dec 01 00:11:05 crc kubenswrapper[4911]: I1201 00:11:05.339147 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:11:05 crc kubenswrapper[4911]: I1201 00:11:05.339215 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:11:05 crc kubenswrapper[4911]: I1201 00:11:05.390777 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:11:05 crc kubenswrapper[4911]: I1201 00:11:05.578182 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:11:05 crc kubenswrapper[4911]: I1201 00:11:05.578233 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:11:05 crc kubenswrapper[4911]: I1201 00:11:05.625617 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:11:05 crc kubenswrapper[4911]: I1201 00:11:05.983640 4911 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:11:07 crc kubenswrapper[4911]: I1201 00:11:07.261311 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:11:07 crc kubenswrapper[4911]: I1201 00:11:07.301540 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:11:07 crc kubenswrapper[4911]: I1201 00:11:07.452498 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:11:07 crc kubenswrapper[4911]: I1201 00:11:07.500430 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:11:08 crc kubenswrapper[4911]: I1201 00:11:08.657749 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:11:08 crc kubenswrapper[4911]: I1201 00:11:08.657805 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:11:08 crc kubenswrapper[4911]: I1201 00:11:08.958021 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8n99g" event={"ID":"b0ebaddd-1c4a-44c5-ab95-54a174396b80","Type":"ContainerStarted","Data":"10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656"} Dec 01 00:11:09 crc kubenswrapper[4911]: I1201 00:11:09.700170 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ffmzr" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="registry-server" probeResult="failure" output=< Dec 01 00:11:09 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Dec 01 00:11:09 crc kubenswrapper[4911]: > Dec 
01 00:11:09 crc kubenswrapper[4911]: I1201 00:11:09.985119 4911 generic.go:334] "Generic (PLEG): container finished" podID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerID="10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656" exitCode=0 Dec 01 00:11:09 crc kubenswrapper[4911]: I1201 00:11:09.985259 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8n99g" event={"ID":"b0ebaddd-1c4a-44c5-ab95-54a174396b80","Type":"ContainerDied","Data":"10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656"} Dec 01 00:11:09 crc kubenswrapper[4911]: I1201 00:11:09.991859 4911 generic.go:334] "Generic (PLEG): container finished" podID="a2963529-140d-4683-bc00-448f1aa1ed51" containerID="db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be" exitCode=0 Dec 01 00:11:09 crc kubenswrapper[4911]: I1201 00:11:09.991902 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cm29t" event={"ID":"a2963529-140d-4683-bc00-448f1aa1ed51","Type":"ContainerDied","Data":"db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be"} Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.212130 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txvnj"] Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.212398 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-txvnj" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="registry-server" containerID="cri-o://0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6" gracePeriod=2 Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.690714 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.746099 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgbkc\" (UniqueName: \"kubernetes.io/projected/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-kube-api-access-sgbkc\") pod \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.746167 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-utilities\") pod \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.746189 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-catalog-content\") pod \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\" (UID: \"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a\") " Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.749175 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-utilities" (OuterVolumeSpecName: "utilities") pod "5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" (UID: "5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.753984 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-kube-api-access-sgbkc" (OuterVolumeSpecName: "kube-api-access-sgbkc") pod "5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" (UID: "5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a"). InnerVolumeSpecName "kube-api-access-sgbkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.766055 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" (UID: "5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.847878 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.847934 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:11 crc kubenswrapper[4911]: I1201 00:11:11.847960 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgbkc\" (UniqueName: \"kubernetes.io/projected/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a-kube-api-access-sgbkc\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.008160 4911 generic.go:334] "Generic (PLEG): container finished" podID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerID="d851babc3fadf86150c231e2fbe63b0bf1c99f01faf643bbef54670474cf1672" exitCode=0 Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.008234 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k7j2" event={"ID":"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209","Type":"ContainerDied","Data":"d851babc3fadf86150c231e2fbe63b0bf1c99f01faf643bbef54670474cf1672"} Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.011050 4911 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-cm29t" event={"ID":"a2963529-140d-4683-bc00-448f1aa1ed51","Type":"ContainerStarted","Data":"586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a"} Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.016971 4911 generic.go:334] "Generic (PLEG): container finished" podID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerID="0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6" exitCode=0 Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.017108 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txvnj" event={"ID":"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a","Type":"ContainerDied","Data":"0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6"} Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.017150 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txvnj" event={"ID":"5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a","Type":"ContainerDied","Data":"b50dafb721de51cc1ddd0818365b29bbd3a73b6d5211cd020f34fd4f27752282"} Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.017194 4911 scope.go:117] "RemoveContainer" containerID="0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.017795 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txvnj" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.029601 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8n99g" event={"ID":"b0ebaddd-1c4a-44c5-ab95-54a174396b80","Type":"ContainerStarted","Data":"a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76"} Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.044398 4911 scope.go:117] "RemoveContainer" containerID="35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.052504 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cm29t" podStartSLOduration=5.7621627140000005 podStartE2EDuration="1m24.052477238s" podCreationTimestamp="2025-12-01 00:09:48 +0000 UTC" firstStartedPulling="2025-12-01 00:09:52.911739006 +0000 UTC m=+153.050435787" lastFinishedPulling="2025-12-01 00:11:11.20205352 +0000 UTC m=+231.340750311" observedRunningTime="2025-12-01 00:11:12.048535454 +0000 UTC m=+232.187232235" watchObservedRunningTime="2025-12-01 00:11:12.052477238 +0000 UTC m=+232.191174009" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.081685 4911 scope.go:117] "RemoveContainer" containerID="8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.087953 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8n99g" podStartSLOduration=6.254208141 podStartE2EDuration="1m28.087934223s" podCreationTimestamp="2025-12-01 00:09:44 +0000 UTC" firstStartedPulling="2025-12-01 00:09:49.48783732 +0000 UTC m=+149.626534091" lastFinishedPulling="2025-12-01 00:11:11.321563402 +0000 UTC m=+231.460260173" observedRunningTime="2025-12-01 00:11:12.072744072 +0000 UTC m=+232.211440873" watchObservedRunningTime="2025-12-01 
00:11:12.087934223 +0000 UTC m=+232.226630994" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.088606 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txvnj"] Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.096812 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-txvnj"] Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.105182 4911 scope.go:117] "RemoveContainer" containerID="0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6" Dec 01 00:11:12 crc kubenswrapper[4911]: E1201 00:11:12.106006 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6\": container with ID starting with 0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6 not found: ID does not exist" containerID="0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.106073 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6"} err="failed to get container status \"0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6\": rpc error: code = NotFound desc = could not find container \"0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6\": container with ID starting with 0d7dc42283307be942c9a6359239ec356aafce2cd2c7c67682af1089430b6ce6 not found: ID does not exist" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.106129 4911 scope.go:117] "RemoveContainer" containerID="35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a" Dec 01 00:11:12 crc kubenswrapper[4911]: E1201 00:11:12.107770 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a\": container with ID starting with 35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a not found: ID does not exist" containerID="35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.107904 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a"} err="failed to get container status \"35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a\": rpc error: code = NotFound desc = could not find container \"35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a\": container with ID starting with 35d55f96d19010d5b0b5b479bf799dbf12768f2529752b9143d6ac3c0fe39f2a not found: ID does not exist" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.108045 4911 scope.go:117] "RemoveContainer" containerID="8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493" Dec 01 00:11:12 crc kubenswrapper[4911]: E1201 00:11:12.110979 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493\": container with ID starting with 8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493 not found: ID does not exist" containerID="8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.111037 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493"} err="failed to get container status \"8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493\": rpc error: code = NotFound desc = could not find container \"8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493\": container with ID 
starting with 8bdf4711567aa73d2dafc376885dac07d8b13b4a6f55253e249cbf471ecb5493 not found: ID does not exist" Dec 01 00:11:12 crc kubenswrapper[4911]: I1201 00:11:12.162604 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" path="/var/lib/kubelet/pods/5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a/volumes" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.055168 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k7j2" event={"ID":"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209","Type":"ContainerStarted","Data":"daea890e1303b126b08c8489aa3c51af3c462f1d830b7bda646c621b69a007db"} Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.075207 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7k7j2" podStartSLOduration=4.983322567 podStartE2EDuration="1m29.075184537s" podCreationTimestamp="2025-12-01 00:09:44 +0000 UTC" firstStartedPulling="2025-12-01 00:09:48.39453955 +0000 UTC m=+148.533236321" lastFinishedPulling="2025-12-01 00:11:12.48640152 +0000 UTC m=+232.625098291" observedRunningTime="2025-12-01 00:11:13.073319258 +0000 UTC m=+233.212016029" watchObservedRunningTime="2025-12-01 00:11:13.075184537 +0000 UTC m=+233.213881308" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.913373 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kc7gz"] Dec 01 00:11:13 crc kubenswrapper[4911]: E1201 00:11:13.913591 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="extract-content" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.913606 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="extract-content" Dec 01 00:11:13 crc kubenswrapper[4911]: E1201 00:11:13.913616 4911 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="71e24cc0-ba97-4a02-a743-f0946de5df74" containerName="pruner" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.913621 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e24cc0-ba97-4a02-a743-f0946de5df74" containerName="pruner" Dec 01 00:11:13 crc kubenswrapper[4911]: E1201 00:11:13.913632 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="extract-utilities" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.913639 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="extract-utilities" Dec 01 00:11:13 crc kubenswrapper[4911]: E1201 00:11:13.913652 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="registry-server" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.913658 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="registry-server" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.913749 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="71e24cc0-ba97-4a02-a743-f0946de5df74" containerName="pruner" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.913759 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5808ddf7-b1d2-420b-b7c6-46a5be4a1d1a" containerName="registry-server" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.914136 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:13 crc kubenswrapper[4911]: I1201 00:11:13.930082 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kc7gz"] Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.076552 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.076640 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27b41bdd-128a-4561-acfe-e75882799f6a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.076727 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27b41bdd-128a-4561-acfe-e75882799f6a-registry-certificates\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.076757 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-bound-sa-token\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.076784 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27b41bdd-128a-4561-acfe-e75882799f6a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.076806 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27b41bdd-128a-4561-acfe-e75882799f6a-trusted-ca\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.076828 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-registry-tls\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.076864 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtgmf\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-kube-api-access-vtgmf\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.104052 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.178110 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27b41bdd-128a-4561-acfe-e75882799f6a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.178181 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27b41bdd-128a-4561-acfe-e75882799f6a-registry-certificates\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.178202 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-bound-sa-token\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.178225 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27b41bdd-128a-4561-acfe-e75882799f6a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.178246 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27b41bdd-128a-4561-acfe-e75882799f6a-trusted-ca\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.178263 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-registry-tls\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.178294 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtgmf\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-kube-api-access-vtgmf\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.179181 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27b41bdd-128a-4561-acfe-e75882799f6a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.180339 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27b41bdd-128a-4561-acfe-e75882799f6a-registry-certificates\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 
00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.180421 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27b41bdd-128a-4561-acfe-e75882799f6a-trusted-ca\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.188209 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-registry-tls\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.190429 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27b41bdd-128a-4561-acfe-e75882799f6a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.198807 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtgmf\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-kube-api-access-vtgmf\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.200166 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27b41bdd-128a-4561-acfe-e75882799f6a-bound-sa-token\") pod \"image-registry-66df7c8f76-kc7gz\" (UID: \"27b41bdd-128a-4561-acfe-e75882799f6a\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.236274 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.525119 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kc7gz"] Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.925190 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.925302 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:11:14 crc kubenswrapper[4911]: I1201 00:11:14.970989 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:11:15 crc kubenswrapper[4911]: I1201 00:11:15.070480 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" event={"ID":"27b41bdd-128a-4561-acfe-e75882799f6a","Type":"ContainerStarted","Data":"bf317cac5aa00d0ee9b94b3a270bc7b72e90465408589e4c72e4125e79ce8190"} Dec 01 00:11:15 crc kubenswrapper[4911]: I1201 00:11:15.652478 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:11:15 crc kubenswrapper[4911]: I1201 00:11:15.713502 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:11:15 crc kubenswrapper[4911]: I1201 00:11:15.713565 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:11:15 crc kubenswrapper[4911]: I1201 00:11:15.760825 4911 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:11:16 crc kubenswrapper[4911]: I1201 00:11:16.116945 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:11:17 crc kubenswrapper[4911]: I1201 00:11:17.084255 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" event={"ID":"27b41bdd-128a-4561-acfe-e75882799f6a","Type":"ContainerStarted","Data":"9ff41666c7f36cdb71b31ae221e20cc042e12c299be1fd1415fe8406d040f870"} Dec 01 00:11:17 crc kubenswrapper[4911]: I1201 00:11:17.111174 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" podStartSLOduration=4.111142242 podStartE2EDuration="4.111142242s" podCreationTimestamp="2025-12-01 00:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:11:17.104125397 +0000 UTC m=+237.242822168" watchObservedRunningTime="2025-12-01 00:11:17.111142242 +0000 UTC m=+237.249839013" Dec 01 00:11:17 crc kubenswrapper[4911]: I1201 00:11:17.411108 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8n99g"] Dec 01 00:11:17 crc kubenswrapper[4911]: I1201 00:11:17.926090 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9qfhm"] Dec 01 00:11:18 crc kubenswrapper[4911]: I1201 00:11:18.090798 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8n99g" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerName="registry-server" containerID="cri-o://a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76" gracePeriod=2 Dec 01 00:11:18 crc kubenswrapper[4911]: 
I1201 00:11:18.091632 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:18 crc kubenswrapper[4911]: I1201 00:11:18.705820 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:11:18 crc kubenswrapper[4911]: I1201 00:11:18.712446 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:11:18 crc kubenswrapper[4911]: I1201 00:11:18.712550 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:11:18 crc kubenswrapper[4911]: I1201 00:11:18.753516 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:11:18 crc kubenswrapper[4911]: I1201 00:11:18.784124 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.144526 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.568355 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.609415 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pntq8"] Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.610068 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pntq8" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerName="registry-server" containerID="cri-o://4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148" gracePeriod=2 Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.671017 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-catalog-content\") pod \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.671095 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb88w\" (UniqueName: \"kubernetes.io/projected/b0ebaddd-1c4a-44c5-ab95-54a174396b80-kube-api-access-fb88w\") pod \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.671136 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-utilities\") pod \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\" (UID: \"b0ebaddd-1c4a-44c5-ab95-54a174396b80\") " Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.672041 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-utilities" (OuterVolumeSpecName: "utilities") pod "b0ebaddd-1c4a-44c5-ab95-54a174396b80" (UID: 
"b0ebaddd-1c4a-44c5-ab95-54a174396b80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.676538 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ebaddd-1c4a-44c5-ab95-54a174396b80-kube-api-access-fb88w" (OuterVolumeSpecName: "kube-api-access-fb88w") pod "b0ebaddd-1c4a-44c5-ab95-54a174396b80" (UID: "b0ebaddd-1c4a-44c5-ab95-54a174396b80"). InnerVolumeSpecName "kube-api-access-fb88w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.729015 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0ebaddd-1c4a-44c5-ab95-54a174396b80" (UID: "b0ebaddd-1c4a-44c5-ab95-54a174396b80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.772698 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.773117 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb88w\" (UniqueName: \"kubernetes.io/projected/b0ebaddd-1c4a-44c5-ab95-54a174396b80-kube-api-access-fb88w\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.773135 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ebaddd-1c4a-44c5-ab95-54a174396b80-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:19 crc kubenswrapper[4911]: I1201 00:11:19.972647 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.075733 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5f2v\" (UniqueName: \"kubernetes.io/projected/fe55751b-c29f-4c22-a636-56c9e3232fdf-kube-api-access-z5f2v\") pod \"fe55751b-c29f-4c22-a636-56c9e3232fdf\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.075805 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-utilities\") pod \"fe55751b-c29f-4c22-a636-56c9e3232fdf\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.075861 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-catalog-content\") pod \"fe55751b-c29f-4c22-a636-56c9e3232fdf\" (UID: \"fe55751b-c29f-4c22-a636-56c9e3232fdf\") " Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.077354 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-utilities" (OuterVolumeSpecName: "utilities") pod "fe55751b-c29f-4c22-a636-56c9e3232fdf" (UID: "fe55751b-c29f-4c22-a636-56c9e3232fdf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.079268 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe55751b-c29f-4c22-a636-56c9e3232fdf-kube-api-access-z5f2v" (OuterVolumeSpecName: "kube-api-access-z5f2v") pod "fe55751b-c29f-4c22-a636-56c9e3232fdf" (UID: "fe55751b-c29f-4c22-a636-56c9e3232fdf"). InnerVolumeSpecName "kube-api-access-z5f2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.108361 4911 generic.go:334] "Generic (PLEG): container finished" podID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerID="4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148" exitCode=0 Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.108435 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pntq8" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.108442 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pntq8" event={"ID":"fe55751b-c29f-4c22-a636-56c9e3232fdf","Type":"ContainerDied","Data":"4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148"} Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.108500 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pntq8" event={"ID":"fe55751b-c29f-4c22-a636-56c9e3232fdf","Type":"ContainerDied","Data":"fe5468e76745cdd5ea04f238857188277e7b99f5f29cc186357ae796fac93fda"} Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.108523 4911 scope.go:117] "RemoveContainer" containerID="4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.112929 4911 generic.go:334] "Generic (PLEG): container finished" podID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerID="a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76" exitCode=0 Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.113535 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8n99g" event={"ID":"b0ebaddd-1c4a-44c5-ab95-54a174396b80","Type":"ContainerDied","Data":"a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76"} Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.113575 4911 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8n99g" event={"ID":"b0ebaddd-1c4a-44c5-ab95-54a174396b80","Type":"ContainerDied","Data":"b4892b0e3d1354bc6307ab036afc8d5481e5dc2905b18f24dcb1d1903b217a22"} Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.113626 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8n99g" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.133927 4911 scope.go:117] "RemoveContainer" containerID="ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.139873 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe55751b-c29f-4c22-a636-56c9e3232fdf" (UID: "fe55751b-c29f-4c22-a636-56c9e3232fdf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.173648 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8n99g"] Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.173688 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8n99g"] Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.175845 4911 scope.go:117] "RemoveContainer" containerID="439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.177044 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.177067 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5f2v\" (UniqueName: \"kubernetes.io/projected/fe55751b-c29f-4c22-a636-56c9e3232fdf-kube-api-access-z5f2v\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.177079 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe55751b-c29f-4c22-a636-56c9e3232fdf-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.190491 4911 scope.go:117] "RemoveContainer" containerID="4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148" Dec 01 00:11:20 crc kubenswrapper[4911]: E1201 00:11:20.190980 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148\": container with ID starting with 4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148 not found: ID does not exist" 
containerID="4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.191022 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148"} err="failed to get container status \"4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148\": rpc error: code = NotFound desc = could not find container \"4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148\": container with ID starting with 4de4850352ed3151d91c57d30f5cdc29fb5e7b219cfe3392ad51641d6bfb9148 not found: ID does not exist" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.191052 4911 scope.go:117] "RemoveContainer" containerID="ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6" Dec 01 00:11:20 crc kubenswrapper[4911]: E1201 00:11:20.191420 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6\": container with ID starting with ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6 not found: ID does not exist" containerID="ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.191478 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6"} err="failed to get container status \"ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6\": rpc error: code = NotFound desc = could not find container \"ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6\": container with ID starting with ccaf1e22947d9285939f28338d472b4205745cc867e809c37d4674fcc38088f6 not found: ID does not exist" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.191507 4911 scope.go:117] 
"RemoveContainer" containerID="439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa" Dec 01 00:11:20 crc kubenswrapper[4911]: E1201 00:11:20.191902 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa\": container with ID starting with 439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa not found: ID does not exist" containerID="439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.192004 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa"} err="failed to get container status \"439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa\": rpc error: code = NotFound desc = could not find container \"439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa\": container with ID starting with 439dd93a0da2fff5c95522e48b7179994988592656a272f8081375eddc2ff6fa not found: ID does not exist" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.192092 4911 scope.go:117] "RemoveContainer" containerID="a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.205347 4911 scope.go:117] "RemoveContainer" containerID="10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.220945 4911 scope.go:117] "RemoveContainer" containerID="45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.232848 4911 scope.go:117] "RemoveContainer" containerID="a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76" Dec 01 00:11:20 crc kubenswrapper[4911]: E1201 00:11:20.233695 4911 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76\": container with ID starting with a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76 not found: ID does not exist" containerID="a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.233795 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76"} err="failed to get container status \"a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76\": rpc error: code = NotFound desc = could not find container \"a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76\": container with ID starting with a54e99d17fdc1fb59a41ff2d10a7cef7326da285ad20ae7e88c278eb72a4ef76 not found: ID does not exist" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.233897 4911 scope.go:117] "RemoveContainer" containerID="10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656" Dec 01 00:11:20 crc kubenswrapper[4911]: E1201 00:11:20.234356 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656\": container with ID starting with 10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656 not found: ID does not exist" containerID="10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.234417 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656"} err="failed to get container status \"10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656\": rpc error: code = NotFound desc = could not find container 
\"10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656\": container with ID starting with 10c8e08971ec8078c4d46c90114e0553a88f362fb9ccb271933d9135f6765656 not found: ID does not exist" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.234487 4911 scope.go:117] "RemoveContainer" containerID="45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad" Dec 01 00:11:20 crc kubenswrapper[4911]: E1201 00:11:20.235089 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad\": container with ID starting with 45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad not found: ID does not exist" containerID="45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.235191 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad"} err="failed to get container status \"45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad\": rpc error: code = NotFound desc = could not find container \"45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad\": container with ID starting with 45f47638dcc7a5d2675070e6627cd5bb8e1f038aff4769fc382ea4b5ea162dad not found: ID does not exist" Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.429157 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pntq8"] Dec 01 00:11:20 crc kubenswrapper[4911]: I1201 00:11:20.431978 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pntq8"] Dec 01 00:11:22 crc kubenswrapper[4911]: I1201 00:11:22.008439 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cm29t"] Dec 01 00:11:22 crc kubenswrapper[4911]: I1201 
00:11:22.008689 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cm29t" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" containerName="registry-server" containerID="cri-o://586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a" gracePeriod=2 Dec 01 00:11:22 crc kubenswrapper[4911]: I1201 00:11:22.159581 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" path="/var/lib/kubelet/pods/b0ebaddd-1c4a-44c5-ab95-54a174396b80/volumes" Dec 01 00:11:22 crc kubenswrapper[4911]: I1201 00:11:22.160926 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" path="/var/lib/kubelet/pods/fe55751b-c29f-4c22-a636-56c9e3232fdf/volumes" Dec 01 00:11:22 crc kubenswrapper[4911]: I1201 00:11:22.890225 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.047117 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-catalog-content\") pod \"a2963529-140d-4683-bc00-448f1aa1ed51\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.047217 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjdvz\" (UniqueName: \"kubernetes.io/projected/a2963529-140d-4683-bc00-448f1aa1ed51-kube-api-access-hjdvz\") pod \"a2963529-140d-4683-bc00-448f1aa1ed51\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.047269 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-utilities\") pod 
\"a2963529-140d-4683-bc00-448f1aa1ed51\" (UID: \"a2963529-140d-4683-bc00-448f1aa1ed51\") " Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.048722 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-utilities" (OuterVolumeSpecName: "utilities") pod "a2963529-140d-4683-bc00-448f1aa1ed51" (UID: "a2963529-140d-4683-bc00-448f1aa1ed51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.053663 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2963529-140d-4683-bc00-448f1aa1ed51-kube-api-access-hjdvz" (OuterVolumeSpecName: "kube-api-access-hjdvz") pod "a2963529-140d-4683-bc00-448f1aa1ed51" (UID: "a2963529-140d-4683-bc00-448f1aa1ed51"). InnerVolumeSpecName "kube-api-access-hjdvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.132719 4911 generic.go:334] "Generic (PLEG): container finished" podID="a2963529-140d-4683-bc00-448f1aa1ed51" containerID="586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a" exitCode=0 Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.132783 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cm29t" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.132797 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cm29t" event={"ID":"a2963529-140d-4683-bc00-448f1aa1ed51","Type":"ContainerDied","Data":"586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a"} Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.133205 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cm29t" event={"ID":"a2963529-140d-4683-bc00-448f1aa1ed51","Type":"ContainerDied","Data":"d95a39fcda25084eb52a52185f28b46f93ed5ca54ad78a5b6fb4c19d43679c05"} Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.133240 4911 scope.go:117] "RemoveContainer" containerID="586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.149041 4911 scope.go:117] "RemoveContainer" containerID="db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.149864 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.149896 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjdvz\" (UniqueName: \"kubernetes.io/projected/a2963529-140d-4683-bc00-448f1aa1ed51-kube-api-access-hjdvz\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:23 crc kubenswrapper[4911]: E1201 00:11:23.158754 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice/crio-fe5468e76745cdd5ea04f238857188277e7b99f5f29cc186357ae796fac93fda\": RecentStats: unable to find data in memory cache]" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.172580 4911 scope.go:117] "RemoveContainer" containerID="1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.184411 4911 scope.go:117] "RemoveContainer" containerID="586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a" Dec 01 00:11:23 crc kubenswrapper[4911]: E1201 00:11:23.184860 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a\": container with ID starting with 586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a not found: ID does not exist" containerID="586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.184914 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a"} err="failed to get container status \"586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a\": rpc error: code = NotFound desc = could not find container \"586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a\": container with ID starting with 586f7935dd95afabc2b56f3161fd59fb715dd39a2778eaa010f48eea15e2b21a not found: ID does not exist" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.184948 4911 scope.go:117] "RemoveContainer" containerID="db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be" Dec 01 00:11:23 crc kubenswrapper[4911]: E1201 00:11:23.185928 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be\": container with ID starting with db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be not found: ID does not exist" containerID="db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.185994 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be"} err="failed to get container status \"db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be\": rpc error: code = NotFound desc = could not find container \"db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be\": container with ID starting with db0ec584453326efbbc7fd3a03167ee5c4234593a1f24c632e88e0c348b1d0be not found: ID does not exist" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.186023 4911 scope.go:117] "RemoveContainer" containerID="1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620" Dec 01 00:11:23 crc kubenswrapper[4911]: E1201 00:11:23.186382 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620\": container with ID starting with 1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620 not found: ID does not exist" containerID="1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.186414 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620"} err="failed to get container status \"1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620\": rpc error: code = NotFound desc = could not find container \"1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620\": 
container with ID starting with 1f5b38f292e584163ae2de27305b8091c50649b2499d8c51e4144bdac85cf620 not found: ID does not exist" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.187941 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2963529-140d-4683-bc00-448f1aa1ed51" (UID: "a2963529-140d-4683-bc00-448f1aa1ed51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.251901 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2963529-140d-4683-bc00-448f1aa1ed51-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.460972 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cm29t"] Dec 01 00:11:23 crc kubenswrapper[4911]: I1201 00:11:23.464738 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cm29t"] Dec 01 00:11:24 crc kubenswrapper[4911]: I1201 00:11:24.158701 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" path="/var/lib/kubelet/pods/a2963529-140d-4683-bc00-448f1aa1ed51/volumes" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.028149 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.973602 4911 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974309 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" 
containerName="extract-utilities" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974330 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" containerName="extract-utilities" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974346 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerName="extract-content" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974354 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerName="extract-content" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974365 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974374 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974386 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974393 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974407 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerName="extract-content" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974413 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerName="extract-content" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974421 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" 
containerName="extract-content" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974430 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" containerName="extract-content" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974447 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerName="extract-utilities" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974453 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerName="extract-utilities" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974475 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974482 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.974496 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerName="extract-utilities" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974503 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerName="extract-utilities" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974624 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ebaddd-1c4a-44c5-ab95-54a174396b80" containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974639 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe55751b-c29f-4c22-a636-56c9e3232fdf" containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.974648 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2963529-140d-4683-bc00-448f1aa1ed51" 
containerName="registry-server" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.975204 4911 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.975372 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.975367 4911 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.975780 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5" gracePeriod=15 Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.975889 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729" gracePeriod=15 Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.975906 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8" gracePeriod=15 Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.975993 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" 
containerID="cri-o://67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2" gracePeriod=15 Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976016 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead" gracePeriod=15 Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.976018 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976047 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.976067 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976080 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.976101 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976114 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.976137 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976148 4911 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.976200 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976212 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.976229 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976243 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 00:11:25 crc kubenswrapper[4911]: E1201 00:11:25.976260 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976271 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976443 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976502 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976530 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 
01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976548 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976564 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 00:11:25 crc kubenswrapper[4911]: I1201 00:11:25.976904 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.019833 4911 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:47938->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.019913 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:47938->192.168.126.11:17697: read: connection reset by peer" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.025768 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.135662 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.135716 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.135738 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.135953 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.136189 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.136231 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.136298 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.136358 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.237607 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.237718 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.237782 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.237814 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.237847 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.237895 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.237939 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.237970 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.238059 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.238494 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.238551 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.238594 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.238636 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 
00:11:26.238681 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.238680 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.238758 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: I1201 00:11:26.312253 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 00:11:26 crc kubenswrapper[4911]: W1201 00:11:26.347850 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-62f411559e19b11e12fab2281ad60738cb9191261434cd73f1700c2b7825566e WatchSource:0}: Error finding container 62f411559e19b11e12fab2281ad60738cb9191261434cd73f1700c2b7825566e: Status 404 returned error can't find the container with id 62f411559e19b11e12fab2281ad60738cb9191261434cd73f1700c2b7825566e Dec 01 00:11:26 crc kubenswrapper[4911]: E1201 00:11:26.354956 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187ceef42347e4e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 00:11:26.353736934 +0000 UTC m=+246.492433735,LastTimestamp:2025-12-01 00:11:26.353736934 +0000 UTC m=+246.492433735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.161785 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f7c14289d3043fc8946b384f7e997a925013f392a49dfc3d531c6720d374cbfb"} Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.162340 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"62f411559e19b11e12fab2281ad60738cb9191261434cd73f1700c2b7825566e"} Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.162637 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.174818 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.176234 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.177100 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead" exitCode=0 Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.177125 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729" exitCode=0 Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 
00:11:27.177136 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8" exitCode=0 Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.177144 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2" exitCode=2 Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.177206 4911 scope.go:117] "RemoveContainer" containerID="2d01368a86bd4158eaa8a5300aef05a1d47b4e35ae3aec25663256e9f3c91bc0" Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.179238 4911 generic.go:334] "Generic (PLEG): container finished" podID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" containerID="419f645f23937820d44d410218cef7477e8c0af05ec2470abf1c4cf595b0d226" exitCode=0 Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.179294 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d","Type":"ContainerDied","Data":"419f645f23937820d44d410218cef7477e8c0af05ec2470abf1c4cf595b0d226"} Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.179898 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:27 crc kubenswrapper[4911]: I1201 00:11:27.180113 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection 
refused" Dec 01 00:11:27 crc kubenswrapper[4911]: E1201 00:11:27.932682 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187ceef42347e4e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 00:11:26.353736934 +0000 UTC m=+246.492433735,LastTimestamp:2025-12-01 00:11:26.353736934 +0000 UTC m=+246.492433735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.187770 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.349295 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.350573 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.351439 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.352039 4911 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.352626 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.424146 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.424576 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.424757 4911 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.425027 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.476983 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.477060 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.477201 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.477478 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.477520 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.477537 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.578926 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-var-lock\") pod \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.579034 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kube-api-access\") pod \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.579138 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kubelet-dir\") pod \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\" (UID: \"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d\") " Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.579531 4911 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.579559 4911 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.579578 4911 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.579648 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" (UID: "629a1ea8-5e5a-44c8-948d-0991ec3e3c5d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.579693 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-var-lock" (OuterVolumeSpecName: "var-lock") pod "629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" (UID: "629a1ea8-5e5a-44c8-948d-0991ec3e3c5d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.587978 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" (UID: "629a1ea8-5e5a-44c8-948d-0991ec3e3c5d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.694304 4911 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.694374 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:28 crc kubenswrapper[4911]: I1201 00:11:28.694402 4911 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/629a1ea8-5e5a-44c8-948d-0991ec3e3c5d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.204616 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"629a1ea8-5e5a-44c8-948d-0991ec3e3c5d","Type":"ContainerDied","Data":"5dd01f15084b4e6e042d2f856c8992899be6a5d8f2e86df4c588316c9a2b7e6d"} Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.204653 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd01f15084b4e6e042d2f856c8992899be6a5d8f2e86df4c588316c9a2b7e6d" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.204700 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.212914 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.213403 4911 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5" exitCode=0 Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.213478 4911 scope.go:117] "RemoveContainer" containerID="9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.213569 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.242306 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.243155 4911 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.243423 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.243737 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.243955 4911 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.245537 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.265319 4911 scope.go:117] "RemoveContainer" containerID="c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.285298 4911 scope.go:117] "RemoveContainer" containerID="7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.310569 4911 scope.go:117] "RemoveContainer" containerID="67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.330504 4911 scope.go:117] 
"RemoveContainer" containerID="d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.355513 4911 scope.go:117] "RemoveContainer" containerID="e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.378559 4911 scope.go:117] "RemoveContainer" containerID="9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead" Dec 01 00:11:29 crc kubenswrapper[4911]: E1201 00:11:29.379168 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\": container with ID starting with 9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead not found: ID does not exist" containerID="9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.379217 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead"} err="failed to get container status \"9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\": rpc error: code = NotFound desc = could not find container \"9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead\": container with ID starting with 9b81413ee49684d3733223bb113a4ac1a0128054c91225f28cfdf91b2bae1ead not found: ID does not exist" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.379248 4911 scope.go:117] "RemoveContainer" containerID="c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729" Dec 01 00:11:29 crc kubenswrapper[4911]: E1201 00:11:29.380095 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\": container with ID starting with 
c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729 not found: ID does not exist" containerID="c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.380127 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729"} err="failed to get container status \"c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\": rpc error: code = NotFound desc = could not find container \"c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729\": container with ID starting with c340520e2a08cb051415bfc4308db7e56839a2ebde4673def1485b035f1dc729 not found: ID does not exist" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.380153 4911 scope.go:117] "RemoveContainer" containerID="7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8" Dec 01 00:11:29 crc kubenswrapper[4911]: E1201 00:11:29.380543 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\": container with ID starting with 7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8 not found: ID does not exist" containerID="7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.380588 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8"} err="failed to get container status \"7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\": rpc error: code = NotFound desc = could not find container \"7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8\": container with ID starting with 7e9f3c91b8ac0ec07f99a0fbf1bb0fe241c5962de03c2ec57b99717d6b9104b8 not found: ID does not 
exist" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.380624 4911 scope.go:117] "RemoveContainer" containerID="67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2" Dec 01 00:11:29 crc kubenswrapper[4911]: E1201 00:11:29.381557 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\": container with ID starting with 67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2 not found: ID does not exist" containerID="67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.381627 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2"} err="failed to get container status \"67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\": rpc error: code = NotFound desc = could not find container \"67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2\": container with ID starting with 67966236c145d2bbb3ed746894ee57b5ae2f2704e2a5514f78ebfade0a0ff9d2 not found: ID does not exist" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.381669 4911 scope.go:117] "RemoveContainer" containerID="d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5" Dec 01 00:11:29 crc kubenswrapper[4911]: E1201 00:11:29.382670 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\": container with ID starting with d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5 not found: ID does not exist" containerID="d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.382734 4911 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5"} err="failed to get container status \"d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\": rpc error: code = NotFound desc = could not find container \"d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5\": container with ID starting with d3e12c9d622ed3ab664347c3b1c50cb03610f07eb0bc80b6bd5040019a90e0b5 not found: ID does not exist" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.382755 4911 scope.go:117] "RemoveContainer" containerID="e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f" Dec 01 00:11:29 crc kubenswrapper[4911]: E1201 00:11:29.383189 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\": container with ID starting with e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f not found: ID does not exist" containerID="e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f" Dec 01 00:11:29 crc kubenswrapper[4911]: I1201 00:11:29.383218 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f"} err="failed to get container status \"e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\": rpc error: code = NotFound desc = could not find container \"e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f\": container with ID starting with e48900986532e2bb5c053e5a23fe10af34fd49abd58ba47661e1b2f400437e7f not found: ID does not exist" Dec 01 00:11:30 crc kubenswrapper[4911]: I1201 00:11:30.157127 4911 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:30 crc kubenswrapper[4911]: I1201 00:11:30.157665 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:30 crc kubenswrapper[4911]: I1201 00:11:30.158173 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:30 crc kubenswrapper[4911]: I1201 00:11:30.164512 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 00:11:33 crc kubenswrapper[4911]: E1201 00:11:33.310219 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice/crio-fe5468e76745cdd5ea04f238857188277e7b99f5f29cc186357ae796fac93fda\": RecentStats: unable to find data in memory cache]" Dec 01 00:11:34 crc kubenswrapper[4911]: I1201 00:11:34.249581 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" Dec 01 00:11:34 crc 
kubenswrapper[4911]: I1201 00:11:34.250933 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:34 crc kubenswrapper[4911]: I1201 00:11:34.251577 4911 status_manager.go:851] "Failed to get status for pod" podUID="27b41bdd-128a-4561-acfe-e75882799f6a" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-kc7gz\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:34 crc kubenswrapper[4911]: I1201 00:11:34.252259 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:34 crc kubenswrapper[4911]: E1201 00:11:34.281930 4911 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.198:6443: connect: connection refused" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" volumeName="registry-storage" Dec 01 00:11:34 crc kubenswrapper[4911]: E1201 00:11:34.399762 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.198:6443: connect: connection refused" Dec 01 00:11:34 crc kubenswrapper[4911]: E1201 00:11:34.400254 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:34 crc kubenswrapper[4911]: E1201 00:11:34.400920 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:34 crc kubenswrapper[4911]: E1201 00:11:34.401334 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:34 crc kubenswrapper[4911]: E1201 00:11:34.401802 4911 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:34 crc kubenswrapper[4911]: I1201 00:11:34.401854 4911 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 00:11:34 crc kubenswrapper[4911]: E1201 00:11:34.402282 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="200ms" Dec 01 00:11:34 crc kubenswrapper[4911]: E1201 00:11:34.603885 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="400ms" Dec 01 00:11:35 crc kubenswrapper[4911]: E1201 00:11:35.004777 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="800ms" Dec 01 00:11:35 crc kubenswrapper[4911]: E1201 00:11:35.806732 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="1.6s" Dec 01 00:11:37 crc kubenswrapper[4911]: I1201 00:11:37.150948 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:37 crc kubenswrapper[4911]: I1201 00:11:37.152005 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:37 crc kubenswrapper[4911]: I1201 00:11:37.152624 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:37 crc kubenswrapper[4911]: I1201 00:11:37.153159 4911 status_manager.go:851] "Failed to get status for pod" 
podUID="27b41bdd-128a-4561-acfe-e75882799f6a" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-kc7gz\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:37 crc kubenswrapper[4911]: I1201 00:11:37.164080 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:37 crc kubenswrapper[4911]: I1201 00:11:37.164231 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:37 crc kubenswrapper[4911]: E1201 00:11:37.164701 4911 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:37 crc kubenswrapper[4911]: I1201 00:11:37.165145 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:37 crc kubenswrapper[4911]: I1201 00:11:37.278769 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"db7558d7fa21a84b6020ba25ccbff419527ee205ec704e78931853dd42c02eb9"} Dec 01 00:11:37 crc kubenswrapper[4911]: E1201 00:11:37.407119 4911 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="3.2s" Dec 01 00:11:37 crc kubenswrapper[4911]: E1201 00:11:37.935297 4911 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187ceef42347e4e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 00:11:26.353736934 +0000 UTC m=+246.492433735,LastTimestamp:2025-12-01 00:11:26.353736934 +0000 UTC m=+246.492433735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.290983 4911 
generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8dde121affef5e7e6090df96d6139ba74e354eaafd95d114797b08288bd2581c" exitCode=0 Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.291033 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8dde121affef5e7e6090df96d6139ba74e354eaafd95d114797b08288bd2581c"} Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.291776 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.291832 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.292205 4911 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:38 crc kubenswrapper[4911]: E1201 00:11:38.292654 4911 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.292765 4911 status_manager.go:851] "Failed to get status for pod" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.293390 4911 status_manager.go:851] "Failed to get status for pod" podUID="27b41bdd-128a-4561-acfe-e75882799f6a" pod="openshift-image-registry/image-registry-66df7c8f76-kc7gz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-kc7gz\": dial tcp 38.102.83.198:6443: connect: connection refused" Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.789722 4911 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 00:11:38 crc kubenswrapper[4911]: I1201 00:11:38.790020 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 00:11:39 crc kubenswrapper[4911]: I1201 00:11:39.302295 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 00:11:39 crc kubenswrapper[4911]: I1201 00:11:39.302369 4911 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920" exitCode=1 Dec 01 00:11:39 crc kubenswrapper[4911]: I1201 00:11:39.302494 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920"} Dec 01 00:11:39 crc kubenswrapper[4911]: I1201 00:11:39.303175 4911 scope.go:117] "RemoveContainer" containerID="bfb14146b83585328641879f2f53e53af7aed28e662b7eb7c3b9cff6ef63c920" Dec 01 00:11:39 crc kubenswrapper[4911]: I1201 00:11:39.307421 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c145386c0a8f98926509cd2121e1653e1fa957c200b536e8fa959ad1a7218196"} Dec 01 00:11:40 crc kubenswrapper[4911]: I1201 00:11:40.317488 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 00:11:40 crc kubenswrapper[4911]: I1201 00:11:40.317984 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f257737ae0bc8cb86f5cc5acd954e84edbcc91ea4e52b62296283278034aad78"} Dec 01 00:11:40 crc kubenswrapper[4911]: I1201 00:11:40.321200 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"efcc792d17cd02196a9ca8181cd62716421533c2376a04ebb05628ded2dde8d1"} Dec 01 00:11:40 crc kubenswrapper[4911]: I1201 00:11:40.321233 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"95228fabf31c0fcb6849baf025b037c4118b8fef7de3222a60ac0883b2698225"} Dec 01 00:11:41 crc kubenswrapper[4911]: I1201 00:11:41.333132 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f9054023d4b457eba6240ecc30c5d14ebef21e4dc007d070cff93960df8f6674"} Dec 01 00:11:41 crc kubenswrapper[4911]: I1201 00:11:41.333183 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1f21bb3c2eccb214c53dbf5db92de4d716057bfd90609bdb3b988c35b68230c1"} Dec 01 00:11:41 crc kubenswrapper[4911]: I1201 00:11:41.333342 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:41 crc kubenswrapper[4911]: I1201 00:11:41.333508 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:41 crc kubenswrapper[4911]: I1201 00:11:41.333539 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:42 crc kubenswrapper[4911]: I1201 00:11:42.166120 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:42 crc kubenswrapper[4911]: I1201 00:11:42.166648 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:42 crc kubenswrapper[4911]: I1201 00:11:42.175556 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:42 crc kubenswrapper[4911]: I1201 00:11:42.953422 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" podUID="85eaed94-1314-4f16-bdf1-a598b183d97c" containerName="oauth-openshift" 
containerID="cri-o://dfd45c213e8bef5e2d17c16d2a055328d9585e7bfe9ae8285e3685b04f291f7d" gracePeriod=15 Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.350020 4911 generic.go:334] "Generic (PLEG): container finished" podID="85eaed94-1314-4f16-bdf1-a598b183d97c" containerID="dfd45c213e8bef5e2d17c16d2a055328d9585e7bfe9ae8285e3685b04f291f7d" exitCode=0 Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.350113 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" event={"ID":"85eaed94-1314-4f16-bdf1-a598b183d97c","Type":"ContainerDied","Data":"dfd45c213e8bef5e2d17c16d2a055328d9585e7bfe9ae8285e3685b04f291f7d"} Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.420545 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432426 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-dir\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432497 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-policies\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432539 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-router-certs\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc 
kubenswrapper[4911]: I1201 00:11:43.432586 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-session\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432612 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-serving-cert\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432639 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djwn8\" (UniqueName: \"kubernetes.io/projected/85eaed94-1314-4f16-bdf1-a598b183d97c-kube-api-access-djwn8\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432663 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-provider-selection\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432690 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-cliconfig\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432733 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-service-ca\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432774 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-ocp-branding-template\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432808 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-idp-0-file-data\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432833 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-trusted-ca-bundle\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432867 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-login\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.432903 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-error\") pod \"85eaed94-1314-4f16-bdf1-a598b183d97c\" (UID: \"85eaed94-1314-4f16-bdf1-a598b183d97c\") " Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.434094 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.434482 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.435312 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.436102 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.436235 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.439729 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.440225 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.440941 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85eaed94-1314-4f16-bdf1-a598b183d97c-kube-api-access-djwn8" (OuterVolumeSpecName: "kube-api-access-djwn8") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "kube-api-access-djwn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.441019 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.441103 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.441319 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.441687 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.442203 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.442538 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "85eaed94-1314-4f16-bdf1-a598b183d97c" (UID: "85eaed94-1314-4f16-bdf1-a598b183d97c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:11:43 crc kubenswrapper[4911]: E1201 00:11:43.472363 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice/crio-fe5468e76745cdd5ea04f238857188277e7b99f5f29cc186357ae796fac93fda\": RecentStats: unable to find data in memory cache]" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534273 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534333 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534354 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djwn8\" (UniqueName: \"kubernetes.io/projected/85eaed94-1314-4f16-bdf1-a598b183d97c-kube-api-access-djwn8\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534372 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534394 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534414 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534431 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534450 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534498 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534519 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534538 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc 
kubenswrapper[4911]: I1201 00:11:43.534556 4911 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85eaed94-1314-4f16-bdf1-a598b183d97c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534574 4911 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:43 crc kubenswrapper[4911]: I1201 00:11:43.534593 4911 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85eaed94-1314-4f16-bdf1-a598b183d97c-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 00:11:44 crc kubenswrapper[4911]: I1201 00:11:44.361301 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" event={"ID":"85eaed94-1314-4f16-bdf1-a598b183d97c","Type":"ContainerDied","Data":"852d6fcbccee4a01432ffe9f472351fd63221d71a10d8aed119175670b18cc18"} Dec 01 00:11:44 crc kubenswrapper[4911]: I1201 00:11:44.361835 4911 scope.go:117] "RemoveContainer" containerID="dfd45c213e8bef5e2d17c16d2a055328d9585e7bfe9ae8285e3685b04f291f7d" Dec 01 00:11:44 crc kubenswrapper[4911]: I1201 00:11:44.361388 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9qfhm" Dec 01 00:11:46 crc kubenswrapper[4911]: I1201 00:11:46.342140 4911 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:46 crc kubenswrapper[4911]: I1201 00:11:46.376946 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:46 crc kubenswrapper[4911]: I1201 00:11:46.377342 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:46 crc kubenswrapper[4911]: I1201 00:11:46.380997 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:11:46 crc kubenswrapper[4911]: I1201 00:11:46.386074 4911 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1815f3df-ef9a-4401-b0f0-ce336fad4ab5" Dec 01 00:11:47 crc kubenswrapper[4911]: I1201 00:11:47.384122 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:47 crc kubenswrapper[4911]: I1201 00:11:47.384182 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:11:48 crc kubenswrapper[4911]: I1201 00:11:48.032879 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:11:48 crc kubenswrapper[4911]: I1201 00:11:48.037045 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 
00:11:48 crc kubenswrapper[4911]: I1201 00:11:48.390485 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:11:49 crc kubenswrapper[4911]: I1201 00:11:49.406895 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 00:11:50 crc kubenswrapper[4911]: I1201 00:11:50.176801 4911 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1815f3df-ef9a-4401-b0f0-ce336fad4ab5" Dec 01 00:11:53 crc kubenswrapper[4911]: E1201 00:11:53.680550 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice/crio-fe5468e76745cdd5ea04f238857188277e7b99f5f29cc186357ae796fac93fda\": RecentStats: unable to find data in memory cache]" Dec 01 00:11:55 crc kubenswrapper[4911]: I1201 00:11:55.133141 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 00:11:55 crc kubenswrapper[4911]: I1201 00:11:55.330490 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 00:11:56 crc kubenswrapper[4911]: I1201 00:11:56.323024 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 00:11:56 crc kubenswrapper[4911]: I1201 00:11:56.564016 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 00:11:56 crc kubenswrapper[4911]: I1201 00:11:56.744999 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 00:11:56 crc kubenswrapper[4911]: I1201 00:11:56.939643 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 00:11:57 crc kubenswrapper[4911]: I1201 00:11:57.140579 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 00:11:57 crc kubenswrapper[4911]: I1201 00:11:57.369801 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 00:11:57 crc kubenswrapper[4911]: I1201 00:11:57.373097 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 00:11:57 crc kubenswrapper[4911]: I1201 00:11:57.722186 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.106338 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.125627 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.180751 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.228281 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.275287 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.518063 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.579285 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.678123 4911 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.779382 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.810521 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 00:11:58 crc kubenswrapper[4911]: I1201 00:11:58.868371 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.060532 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.070054 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.108665 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.151555 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.154697 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.262418 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.266103 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.323076 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.354576 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.365357 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.519439 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.540261 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.765684 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.871579 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.871580 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 00:11:59 crc kubenswrapper[4911]: I1201 00:11:59.976332 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.210274 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.230847 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.251982 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.349109 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.446342 4911 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.596147 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.776836 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.922449 4911 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 00:12:00 crc kubenswrapper[4911]: I1201 00:12:00.995114 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.008499 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.072638 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.237689 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.392944 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.451681 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.454306 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.492192 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.493960 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.516191 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.567822 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.645633 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.858315 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.907131 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 00:12:01 crc kubenswrapper[4911]: I1201 00:12:01.925011 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.068169 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.098531 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.104329 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.184589 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.321198 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.334703 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.366133 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.398067 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.449019 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.449332 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.481577 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.515749 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.584262 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.699027 4911 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.733095 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.821133 4911 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.902222 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.922028 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 00:12:02 crc kubenswrapper[4911]: I1201 00:12:02.957142 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.078809 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.083922 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.092813 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.191672 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.250427 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.272971 4911 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.286274 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.317638 4911 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.384285 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.441081 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.502741 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.594763 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.596136 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.713590 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.817085 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 00:12:03 crc kubenswrapper[4911]: E1201 00:12:03.855890 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice/crio-fe5468e76745cdd5ea04f238857188277e7b99f5f29cc186357ae796fac93fda\": RecentStats: 
unable to find data in memory cache]" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.885615 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 00:12:03 crc kubenswrapper[4911]: I1201 00:12:03.953604 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.065696 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.396949 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.417371 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.532034 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.555226 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.573686 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.579352 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.580579 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.647035 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.672585 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.690659 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.738282 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.749903 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.778291 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.779531 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.840393 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 00:12:04 crc kubenswrapper[4911]: I1201 00:12:04.990763 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.076090 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.199526 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.235560 4911 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.238310 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.266480 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.299006 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.412419 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.417890 4911 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.576899 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.754054 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.791037 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.821634 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.835079 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 
00:12:05.876268 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 00:12:05 crc kubenswrapper[4911]: I1201 00:12:05.879152 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.020323 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.141553 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.288257 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.353303 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.353593 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.353607 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.353699 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.354599 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.364342 4911 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"trusted-ca-bundle" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.379710 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.396875 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.504181 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.550062 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.694407 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.846143 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.957341 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.969952 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 00:12:06 crc kubenswrapper[4911]: I1201 00:12:06.997196 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.040170 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.126131 4911 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.146414 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.148360 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.199184 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.200788 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.209554 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.216184 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.318116 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.341394 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.341960 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.451903 4911 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.528113 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.578863 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.635076 4911 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.638880 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.638864602 podStartE2EDuration="41.638864602s" podCreationTimestamp="2025-12-01 00:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:11:45.471449299 +0000 UTC m=+265.610146080" watchObservedRunningTime="2025-12-01 00:12:07.638864602 +0000 UTC m=+287.777561373" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.639222 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9qfhm","openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.639266 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s","openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 00:12:07 crc kubenswrapper[4911]: E1201 00:12:07.639448 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" containerName="installer" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.639481 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" 
containerName="installer" Dec 01 00:12:07 crc kubenswrapper[4911]: E1201 00:12:07.639503 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85eaed94-1314-4f16-bdf1-a598b183d97c" containerName="oauth-openshift" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.639511 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="85eaed94-1314-4f16-bdf1-a598b183d97c" containerName="oauth-openshift" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.639706 4911 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.639725 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ca553967-361c-45e2-9f78-15e5bedc7ea6" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.639979 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="85eaed94-1314-4f16-bdf1-a598b183d97c" containerName="oauth-openshift" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.640040 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="629a1ea8-5e5a-44c8-948d-0991ec3e3c5d" containerName="installer" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.640679 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9hgb"] Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.640870 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.644411 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.644939 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.645186 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.645553 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.645751 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.646315 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.646605 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.647168 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.647356 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.647674 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 00:12:07 crc 
kubenswrapper[4911]: I1201 00:12:07.647764 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.647871 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.647688 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.662357 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.665290 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.667184 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.683697 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.683670965 podStartE2EDuration="21.683670965s" podCreationTimestamp="2025-12-01 00:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:12:07.67905942 +0000 UTC m=+287.817756191" watchObservedRunningTime="2025-12-01 00:12:07.683670965 +0000 UTC m=+287.822367746" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793167 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793234 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfgfn\" (UniqueName: \"kubernetes.io/projected/d82554d4-8590-47f5-841f-2ee91739554c-kube-api-access-sfgfn\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793269 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793312 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d82554d4-8590-47f5-841f-2ee91739554c-audit-dir\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793349 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " 
pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793504 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793551 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-error\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793584 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793610 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 
00:12:07.793642 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-login\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793663 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793682 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-audit-policies\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793763 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.793792 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-session\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.803213 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.889676 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895703 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895748 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-error\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895779 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895797 
4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895824 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-login\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895842 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895862 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-audit-policies\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895880 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-session\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.895898 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.896921 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-audit-policies\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.897426 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.897607 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfgfn\" (UniqueName: \"kubernetes.io/projected/d82554d4-8590-47f5-841f-2ee91739554c-kube-api-access-sfgfn\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.897634 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.897658 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.897747 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d82554d4-8590-47f5-841f-2ee91739554c-audit-dir\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.897816 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.898609 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.898670 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d82554d4-8590-47f5-841f-2ee91739554c-audit-dir\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.899583 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.903167 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-error\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.903362 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.911255 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.911682 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-session\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.911682 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.911931 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-template-login\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.913189 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.915213 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d82554d4-8590-47f5-841f-2ee91739554c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.918141 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfgfn\" (UniqueName: \"kubernetes.io/projected/d82554d4-8590-47f5-841f-2ee91739554c-kube-api-access-sfgfn\") pod \"oauth-openshift-6fdcc7ff8c-npn2s\" (UID: \"d82554d4-8590-47f5-841f-2ee91739554c\") " pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.961641 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 01 00:12:07 crc kubenswrapper[4911]: I1201 00:12:07.969350 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.059172 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.084332 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.090691 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.105074 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.160825 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85eaed94-1314-4f16-bdf1-a598b183d97c" path="/var/lib/kubelet/pods/85eaed94-1314-4f16-bdf1-a598b183d97c/volumes"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.161933 4911 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.162165 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f7c14289d3043fc8946b384f7e997a925013f392a49dfc3d531c6720d374cbfb" gracePeriod=5
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.175283 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.192432 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"]
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.214260 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.245455 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.255348 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.308956 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.359378 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.447237 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"]
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.494944 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.498087 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.520253 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.532150 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.543861 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" event={"ID":"d82554d4-8590-47f5-841f-2ee91739554c","Type":"ContainerStarted","Data":"62e5e98a3622d7d852fb39c49fd20bd94192b420b81fd74e4877ac064da053ed"}
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.610712 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.668965 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.719904 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.733339 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.903113 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.939334 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 01 00:12:08 crc kubenswrapper[4911]: I1201 00:12:08.993813 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.025730 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.044011 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.085445 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.318131 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.395400 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.461201 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.552356 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.555232 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" event={"ID":"d82554d4-8590-47f5-841f-2ee91739554c","Type":"ContainerStarted","Data":"09fb39109f57d10f8f1d422daa11d7ade66e65092b60248c438e4486734f1cfe"}
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.555710 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.579573 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.594497 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6fdcc7ff8c-npn2s" podStartSLOduration=52.594453722 podStartE2EDuration="52.594453722s" podCreationTimestamp="2025-12-01 00:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:12:09.59364873 +0000 UTC m=+289.732345581" watchObservedRunningTime="2025-12-01 00:12:09.594453722 +0000 UTC m=+289.733150503"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.620009 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.765629 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.780258 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.782578 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.789704 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.802235 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.827683 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 01 00:12:09 crc kubenswrapper[4911]: I1201 00:12:09.949937 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.034388 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.102102 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.271646 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.397752 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.527123 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.640597 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.678345 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.856626 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 01 00:12:10 crc kubenswrapper[4911]: I1201 00:12:10.913160 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.008709 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.040434 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.058485 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.059435 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.122069 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.288093 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.395481 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.456294 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.457815 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.460446 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.688299 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.761726 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.840680 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 01 00:12:11 crc kubenswrapper[4911]: I1201 00:12:11.855416 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.260430 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.261561 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.380450 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.480751 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.509596 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.579810 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.590450 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.599205 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 01 00:12:12 crc kubenswrapper[4911]: I1201 00:12:12.768808 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.050385 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.457358 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.587952 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.588043 4911 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f7c14289d3043fc8946b384f7e997a925013f392a49dfc3d531c6720d374cbfb" exitCode=137
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.614777 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.727980 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.771818 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.771920 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809191 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809264 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809340 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809384 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809449 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809499 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809574 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809601 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809670 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.809995 4911 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.810027 4911 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.810051 4911 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.810076 4911 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.822288 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 00:12:13 crc kubenswrapper[4911]: I1201 00:12:13.911024 4911 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 01 00:12:13 crc kubenswrapper[4911]: E1201 00:12:13.993658 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice/crio-fe5468e76745cdd5ea04f238857188277e7b99f5f29cc186357ae796fac93fda\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe55751b_c29f_4c22_a636_56c9e3232fdf.slice\": RecentStats: unable to find data in memory cache]"
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.166890 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.167371 4911 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.186092 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.186147 4911 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c9c8549e-7496-441d-8885-2a4d574341b4"
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.192901 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.192958 4911 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c9c8549e-7496-441d-8885-2a4d574341b4"
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.377817 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.603967 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.604084 4911 scope.go:117] "RemoveContainer" containerID="f7c14289d3043fc8946b384f7e997a925013f392a49dfc3d531c6720d374cbfb"
Dec 01 00:12:14 crc kubenswrapper[4911]: I1201 00:12:14.604224 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 01 00:12:19 crc kubenswrapper[4911]: I1201 00:12:19.993052 4911 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Dec 01 00:12:20 crc kubenswrapper[4911]: I1201 00:12:20.949795 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9xb9"]
Dec 01 00:12:20 crc kubenswrapper[4911]: I1201 00:12:20.951192 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f9xb9" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerName="registry-server" containerID="cri-o://1eeb8fc3b6c38a0811635435e6b28e376735a0962606d2fc259a3c285b58a34f" gracePeriod=30
Dec 01 00:12:20 crc kubenswrapper[4911]: I1201 00:12:20.957984 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7k7j2"]
Dec 01 00:12:20 crc kubenswrapper[4911]: I1201 00:12:20.958230 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7k7j2" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerName="registry-server" containerID="cri-o://daea890e1303b126b08c8489aa3c51af3c462f1d830b7bda646c621b69a007db" gracePeriod=30
Dec 01 00:12:20 crc kubenswrapper[4911]: I1201 00:12:20.976042 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66zhz"]
Dec 01 00:12:20 crc kubenswrapper[4911]: I1201 00:12:20.978053 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" podUID="a4b27e6c-5803-46ae-ac80-00f249cb714c" containerName="marketplace-operator" containerID="cri-o://76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675" gracePeriod=30
Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.004275 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drp7"]
Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.004800 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5drp7" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="registry-server" containerID="cri-o://56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80" gracePeriod=30
Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.011251 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v4cgp"]
Dec 01 00:12:21 crc kubenswrapper[4911]: E1201 00:12:21.011578 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.011599 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 01 
00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.011760 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.012268 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.014809 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ffmzr"] Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.015193 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ffmzr" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="registry-server" containerID="cri-o://c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c" gracePeriod=30 Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.018006 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v4cgp"] Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.025449 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f286bcc-7bb8-4571-a057-4db77eee17a6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.025612 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f286bcc-7bb8-4571-a057-4db77eee17a6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.025665 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrvb\" (UniqueName: \"kubernetes.io/projected/6f286bcc-7bb8-4571-a057-4db77eee17a6-kube-api-access-9zrvb\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.128240 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f286bcc-7bb8-4571-a057-4db77eee17a6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.128859 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f286bcc-7bb8-4571-a057-4db77eee17a6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.128929 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zrvb\" (UniqueName: \"kubernetes.io/projected/6f286bcc-7bb8-4571-a057-4db77eee17a6-kube-api-access-9zrvb\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.130083 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6f286bcc-7bb8-4571-a057-4db77eee17a6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.137596 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f286bcc-7bb8-4571-a057-4db77eee17a6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.146132 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zrvb\" (UniqueName: \"kubernetes.io/projected/6f286bcc-7bb8-4571-a057-4db77eee17a6-kube-api-access-9zrvb\") pod \"marketplace-operator-79b997595-v4cgp\" (UID: \"6f286bcc-7bb8-4571-a057-4db77eee17a6\") " pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.441012 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.454690 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.460817 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.467189 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636231 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wjbh\" (UniqueName: \"kubernetes.io/projected/a55ff1f6-20d7-435a-9764-59a0b24f7000-kube-api-access-9wjbh\") pod \"a55ff1f6-20d7-435a-9764-59a0b24f7000\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636614 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmmvs\" (UniqueName: \"kubernetes.io/projected/ed713643-05a3-45af-a821-b053054528dd-kube-api-access-gmmvs\") pod \"ed713643-05a3-45af-a821-b053054528dd\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636682 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-catalog-content\") pod \"a55ff1f6-20d7-435a-9764-59a0b24f7000\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636721 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-utilities\") pod \"ed713643-05a3-45af-a821-b053054528dd\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636758 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-catalog-content\") pod \"ed713643-05a3-45af-a821-b053054528dd\" (UID: \"ed713643-05a3-45af-a821-b053054528dd\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636785 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-9hh47\" (UniqueName: \"kubernetes.io/projected/a4b27e6c-5803-46ae-ac80-00f249cb714c-kube-api-access-9hh47\") pod \"a4b27e6c-5803-46ae-ac80-00f249cb714c\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636824 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-trusted-ca\") pod \"a4b27e6c-5803-46ae-ac80-00f249cb714c\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636855 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-utilities\") pod \"a55ff1f6-20d7-435a-9764-59a0b24f7000\" (UID: \"a55ff1f6-20d7-435a-9764-59a0b24f7000\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.636880 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-operator-metrics\") pod \"a4b27e6c-5803-46ae-ac80-00f249cb714c\" (UID: \"a4b27e6c-5803-46ae-ac80-00f249cb714c\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.640248 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-utilities" (OuterVolumeSpecName: "utilities") pod "ed713643-05a3-45af-a821-b053054528dd" (UID: "ed713643-05a3-45af-a821-b053054528dd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.640819 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a4b27e6c-5803-46ae-ac80-00f249cb714c" (UID: "a4b27e6c-5803-46ae-ac80-00f249cb714c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.641235 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-utilities" (OuterVolumeSpecName: "utilities") pod "a55ff1f6-20d7-435a-9764-59a0b24f7000" (UID: "a55ff1f6-20d7-435a-9764-59a0b24f7000"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.642258 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55ff1f6-20d7-435a-9764-59a0b24f7000-kube-api-access-9wjbh" (OuterVolumeSpecName: "kube-api-access-9wjbh") pod "a55ff1f6-20d7-435a-9764-59a0b24f7000" (UID: "a55ff1f6-20d7-435a-9764-59a0b24f7000"). InnerVolumeSpecName "kube-api-access-9wjbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.643905 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b27e6c-5803-46ae-ac80-00f249cb714c-kube-api-access-9hh47" (OuterVolumeSpecName: "kube-api-access-9hh47") pod "a4b27e6c-5803-46ae-ac80-00f249cb714c" (UID: "a4b27e6c-5803-46ae-ac80-00f249cb714c"). InnerVolumeSpecName "kube-api-access-9hh47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.644934 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a4b27e6c-5803-46ae-ac80-00f249cb714c" (UID: "a4b27e6c-5803-46ae-ac80-00f249cb714c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.671647 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed713643-05a3-45af-a821-b053054528dd-kube-api-access-gmmvs" (OuterVolumeSpecName: "kube-api-access-gmmvs") pod "ed713643-05a3-45af-a821-b053054528dd" (UID: "ed713643-05a3-45af-a821-b053054528dd"). InnerVolumeSpecName "kube-api-access-gmmvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.672977 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a55ff1f6-20d7-435a-9764-59a0b24f7000" (UID: "a55ff1f6-20d7-435a-9764-59a0b24f7000"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.683714 4911 generic.go:334] "Generic (PLEG): container finished" podID="a4b27e6c-5803-46ae-ac80-00f249cb714c" containerID="76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675" exitCode=0 Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.683789 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.683795 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" event={"ID":"a4b27e6c-5803-46ae-ac80-00f249cb714c","Type":"ContainerDied","Data":"76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675"} Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.683893 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-66zhz" event={"ID":"a4b27e6c-5803-46ae-ac80-00f249cb714c","Type":"ContainerDied","Data":"93c9631975f431c8ab23c92738167222dbf27ece32bee1b4614500b9b7bbdd57"} Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.683913 4911 scope.go:117] "RemoveContainer" containerID="76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.689355 4911 generic.go:334] "Generic (PLEG): container finished" podID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerID="1eeb8fc3b6c38a0811635435e6b28e376735a0962606d2fc259a3c285b58a34f" exitCode=0 Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.689412 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9xb9" event={"ID":"9a33691a-6c8a-47ac-9d8a-cce2a68425e7","Type":"ContainerDied","Data":"1eeb8fc3b6c38a0811635435e6b28e376735a0962606d2fc259a3c285b58a34f"} Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.712362 4911 generic.go:334] "Generic (PLEG): container finished" podID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerID="56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80" exitCode=0 Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.712493 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drp7" 
event={"ID":"a55ff1f6-20d7-435a-9764-59a0b24f7000","Type":"ContainerDied","Data":"56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80"} Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.712534 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drp7" event={"ID":"a55ff1f6-20d7-435a-9764-59a0b24f7000","Type":"ContainerDied","Data":"4e16867a0d18af888a66e57c5f8e67ab50526e028c9d5a3e32df6c4b83f740d0"} Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.712534 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5drp7" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.718226 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66zhz"] Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.719241 4911 generic.go:334] "Generic (PLEG): container finished" podID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerID="daea890e1303b126b08c8489aa3c51af3c462f1d830b7bda646c621b69a007db" exitCode=0 Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.719324 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k7j2" event={"ID":"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209","Type":"ContainerDied","Data":"daea890e1303b126b08c8489aa3c51af3c462f1d830b7bda646c621b69a007db"} Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.722571 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66zhz"] Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.724426 4911 scope.go:117] "RemoveContainer" containerID="76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675" Dec 01 00:12:21 crc kubenswrapper[4911]: E1201 00:12:21.724833 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675\": container with ID starting with 76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675 not found: ID does not exist" containerID="76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.724858 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675"} err="failed to get container status \"76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675\": rpc error: code = NotFound desc = could not find container \"76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675\": container with ID starting with 76207d0a88fcc1559ed4a6eead9645e85045137a510e027dae29d107fceae675 not found: ID does not exist" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.724876 4911 scope.go:117] "RemoveContainer" containerID="56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.728017 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v4cgp"] Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.728652 4911 generic.go:334] "Generic (PLEG): container finished" podID="ed713643-05a3-45af-a821-b053054528dd" containerID="c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c" exitCode=0 Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.728702 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffmzr" event={"ID":"ed713643-05a3-45af-a821-b053054528dd","Type":"ContainerDied","Data":"c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c"} Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.728726 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffmzr" 
event={"ID":"ed713643-05a3-45af-a821-b053054528dd","Type":"ContainerDied","Data":"909e66524f4aa7cf79141068f5d404fb0703c6164688c88358f7cb389acb2ac2"} Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.728805 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ffmzr" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.739216 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wjbh\" (UniqueName: \"kubernetes.io/projected/a55ff1f6-20d7-435a-9764-59a0b24f7000-kube-api-access-9wjbh\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.739254 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmmvs\" (UniqueName: \"kubernetes.io/projected/ed713643-05a3-45af-a821-b053054528dd-kube-api-access-gmmvs\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.739267 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.739280 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.739297 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hh47\" (UniqueName: \"kubernetes.io/projected/a4b27e6c-5803-46ae-ac80-00f249cb714c-kube-api-access-9hh47\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.739337 4911 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.739347 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a55ff1f6-20d7-435a-9764-59a0b24f7000-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.739357 4911 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4b27e6c-5803-46ae-ac80-00f249cb714c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.744772 4911 scope.go:117] "RemoveContainer" containerID="92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.753863 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drp7"] Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.765546 4911 scope.go:117] "RemoveContainer" containerID="d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.767787 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drp7"] Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.787740 4911 scope.go:117] "RemoveContainer" containerID="56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80" Dec 01 00:12:21 crc kubenswrapper[4911]: E1201 00:12:21.788235 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80\": container with ID starting with 56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80 not found: ID does not exist" containerID="56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.788265 4911 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80"} err="failed to get container status \"56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80\": rpc error: code = NotFound desc = could not find container \"56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80\": container with ID starting with 56152db080ff3baec8a660d9f1f524b1275f154caad6fe5219cfa9fb0c6f8b80 not found: ID does not exist" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.788293 4911 scope.go:117] "RemoveContainer" containerID="92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98" Dec 01 00:12:21 crc kubenswrapper[4911]: E1201 00:12:21.789380 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98\": container with ID starting with 92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98 not found: ID does not exist" containerID="92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.789473 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98"} err="failed to get container status \"92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98\": rpc error: code = NotFound desc = could not find container \"92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98\": container with ID starting with 92056b71e5ec97f101754ec7f9d7910ad344785b0a3481679e3d51405ca89d98 not found: ID does not exist" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.789533 4911 scope.go:117] "RemoveContainer" containerID="d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b" Dec 01 00:12:21 crc kubenswrapper[4911]: E1201 00:12:21.790199 4911 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b\": container with ID starting with d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b not found: ID does not exist" containerID="d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.790235 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b"} err="failed to get container status \"d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b\": rpc error: code = NotFound desc = could not find container \"d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b\": container with ID starting with d12c38707ca113d71d139a98075e81abbe6337547f8a85c383ad26c646ca5d9b not found: ID does not exist" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.790256 4911 scope.go:117] "RemoveContainer" containerID="c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.801936 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed713643-05a3-45af-a821-b053054528dd" (UID: "ed713643-05a3-45af-a821-b053054528dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.807746 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.814709 4911 scope.go:117] "RemoveContainer" containerID="0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.840939 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed713643-05a3-45af-a821-b053054528dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.842711 4911 scope.go:117] "RemoveContainer" containerID="e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.860317 4911 scope.go:117] "RemoveContainer" containerID="c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c" Dec 01 00:12:21 crc kubenswrapper[4911]: E1201 00:12:21.860839 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c\": container with ID starting with c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c not found: ID does not exist" containerID="c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.860882 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c"} err="failed to get container status \"c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c\": rpc error: code = NotFound desc = could not find container \"c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c\": container with ID starting with c6b55adb9c652be130e0c0b25f300eb1e7611a863b8d6eace35b7c631b5b521c not found: ID does not exist" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 
00:12:21.860913 4911 scope.go:117] "RemoveContainer" containerID="0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7" Dec 01 00:12:21 crc kubenswrapper[4911]: E1201 00:12:21.861336 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7\": container with ID starting with 0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7 not found: ID does not exist" containerID="0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.861412 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7"} err="failed to get container status \"0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7\": rpc error: code = NotFound desc = could not find container \"0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7\": container with ID starting with 0d268e7ccd58db89f1114c98808c69c6fc2721754f9b1ff3ae374d3baf8d09a7 not found: ID does not exist" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.861490 4911 scope.go:117] "RemoveContainer" containerID="e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08" Dec 01 00:12:21 crc kubenswrapper[4911]: E1201 00:12:21.861881 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08\": container with ID starting with e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08 not found: ID does not exist" containerID="e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.861903 4911 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08"} err="failed to get container status \"e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08\": rpc error: code = NotFound desc = could not find container \"e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08\": container with ID starting with e1991c02b00495ff9e8c765464bcb6ab38073b7eeff9606dee8525ff1d119d08 not found: ID does not exist" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.870217 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.941678 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6l6c\" (UniqueName: \"kubernetes.io/projected/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-kube-api-access-g6l6c\") pod \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.941798 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-catalog-content\") pod \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.941853 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-utilities\") pod \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\" (UID: \"9a33691a-6c8a-47ac-9d8a-cce2a68425e7\") " Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.942744 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-utilities" (OuterVolumeSpecName: "utilities") pod 
"9a33691a-6c8a-47ac-9d8a-cce2a68425e7" (UID: "9a33691a-6c8a-47ac-9d8a-cce2a68425e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:21 crc kubenswrapper[4911]: I1201 00:12:21.948901 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-kube-api-access-g6l6c" (OuterVolumeSpecName: "kube-api-access-g6l6c") pod "9a33691a-6c8a-47ac-9d8a-cce2a68425e7" (UID: "9a33691a-6c8a-47ac-9d8a-cce2a68425e7"). InnerVolumeSpecName "kube-api-access-g6l6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.002547 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a33691a-6c8a-47ac-9d8a-cce2a68425e7" (UID: "9a33691a-6c8a-47ac-9d8a-cce2a68425e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.042485 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-utilities\") pod \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.042569 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-catalog-content\") pod \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.042631 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h82kq\" (UniqueName: \"kubernetes.io/projected/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-kube-api-access-h82kq\") pod \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\" (UID: \"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209\") " Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.042885 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.042900 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.042911 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6l6c\" (UniqueName: \"kubernetes.io/projected/9a33691a-6c8a-47ac-9d8a-cce2a68425e7-kube-api-access-g6l6c\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.043762 
4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-utilities" (OuterVolumeSpecName: "utilities") pod "5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" (UID: "5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.045810 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-kube-api-access-h82kq" (OuterVolumeSpecName: "kube-api-access-h82kq") pod "5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" (UID: "5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209"). InnerVolumeSpecName "kube-api-access-h82kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.054755 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ffmzr"] Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.057945 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ffmzr"] Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.086611 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" (UID: "5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.143756 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.143795 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.143809 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h82kq\" (UniqueName: \"kubernetes.io/projected/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209-kube-api-access-h82kq\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.159906 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b27e6c-5803-46ae-ac80-00f249cb714c" path="/var/lib/kubelet/pods/a4b27e6c-5803-46ae-ac80-00f249cb714c/volumes" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.160649 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" path="/var/lib/kubelet/pods/a55ff1f6-20d7-435a-9764-59a0b24f7000/volumes" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.161532 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed713643-05a3-45af-a821-b053054528dd" path="/var/lib/kubelet/pods/ed713643-05a3-45af-a821-b053054528dd/volumes" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.742325 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9xb9" event={"ID":"9a33691a-6c8a-47ac-9d8a-cce2a68425e7","Type":"ContainerDied","Data":"be9bb614e9fc64cd82cc3ae81843e2fa9a31a47a5d99383321241d6e7c819914"} Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.742384 
4911 scope.go:117] "RemoveContainer" containerID="1eeb8fc3b6c38a0811635435e6b28e376735a0962606d2fc259a3c285b58a34f" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.742409 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9xb9" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.748171 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k7j2" event={"ID":"5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209","Type":"ContainerDied","Data":"85fc4ccccdb0542b067e66b77ca56a620eb7363260e1029201f8ce60a6616d03"} Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.748305 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7k7j2" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.758144 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" event={"ID":"6f286bcc-7bb8-4571-a057-4db77eee17a6","Type":"ContainerStarted","Data":"df0527cc298cc7c5f1275e284ad5b906b64266a9e79d2109885cbda0faa17420"} Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.758208 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" event={"ID":"6f286bcc-7bb8-4571-a057-4db77eee17a6","Type":"ContainerStarted","Data":"cf66c27532909a430c4283477510d1327d19ad51fab6a06579d21416121467b6"} Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.758442 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.763491 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.778606 4911 scope.go:117] 
"RemoveContainer" containerID="240b25678cb2dcd751580908e3275fdb0e9ac474a36027a15b161df9d288dfc7" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.779813 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9xb9"] Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.792209 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f9xb9"] Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.792789 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-v4cgp" podStartSLOduration=2.792757299 podStartE2EDuration="2.792757299s" podCreationTimestamp="2025-12-01 00:12:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:12:22.788034341 +0000 UTC m=+302.926731152" watchObservedRunningTime="2025-12-01 00:12:22.792757299 +0000 UTC m=+302.931454110" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.818330 4911 scope.go:117] "RemoveContainer" containerID="aebfb7e3291fb2fe1d4d6f6430f106a12796b89337cd97706899ac7445bd2253" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.821727 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7k7j2"] Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.831076 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7k7j2"] Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.851070 4911 scope.go:117] "RemoveContainer" containerID="daea890e1303b126b08c8489aa3c51af3c462f1d830b7bda646c621b69a007db" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.868264 4911 scope.go:117] "RemoveContainer" containerID="d851babc3fadf86150c231e2fbe63b0bf1c99f01faf643bbef54670474cf1672" Dec 01 00:12:22 crc kubenswrapper[4911]: I1201 00:12:22.888598 4911 scope.go:117] 
"RemoveContainer" containerID="8656b9465f538d0005434d71581f070b621473bfa8e12ec2194d58ed58ad433f" Dec 01 00:12:24 crc kubenswrapper[4911]: I1201 00:12:24.177135 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" path="/var/lib/kubelet/pods/5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209/volumes" Dec 01 00:12:24 crc kubenswrapper[4911]: I1201 00:12:24.180241 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" path="/var/lib/kubelet/pods/9a33691a-6c8a-47ac-9d8a-cce2a68425e7/volumes" Dec 01 00:12:32 crc kubenswrapper[4911]: I1201 00:12:32.700590 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" podUID="b2bca0c5-b712-4648-a9a8-34543b89d5db" containerName="registry" containerID="cri-o://bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173" gracePeriod=30 Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.101726 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.203348 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2bca0c5-b712-4648-a9a8-34543b89d5db-ca-trust-extracted\") pod \"b2bca0c5-b712-4648-a9a8-34543b89d5db\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.203757 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b2bca0c5-b712-4648-a9a8-34543b89d5db\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.203839 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-certificates\") pod \"b2bca0c5-b712-4648-a9a8-34543b89d5db\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.203898 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-trusted-ca\") pod \"b2bca0c5-b712-4648-a9a8-34543b89d5db\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.203984 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2hm7\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-kube-api-access-q2hm7\") pod \"b2bca0c5-b712-4648-a9a8-34543b89d5db\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.204050 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-bound-sa-token\") pod \"b2bca0c5-b712-4648-a9a8-34543b89d5db\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.204109 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2bca0c5-b712-4648-a9a8-34543b89d5db-installation-pull-secrets\") pod \"b2bca0c5-b712-4648-a9a8-34543b89d5db\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.204155 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-tls\") pod \"b2bca0c5-b712-4648-a9a8-34543b89d5db\" (UID: \"b2bca0c5-b712-4648-a9a8-34543b89d5db\") " Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.204697 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b2bca0c5-b712-4648-a9a8-34543b89d5db" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.204716 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b2bca0c5-b712-4648-a9a8-34543b89d5db" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.212030 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b2bca0c5-b712-4648-a9a8-34543b89d5db" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.213277 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b2bca0c5-b712-4648-a9a8-34543b89d5db" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.215814 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b2bca0c5-b712-4648-a9a8-34543b89d5db" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.216563 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-kube-api-access-q2hm7" (OuterVolumeSpecName: "kube-api-access-q2hm7") pod "b2bca0c5-b712-4648-a9a8-34543b89d5db" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db"). InnerVolumeSpecName "kube-api-access-q2hm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.216939 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2bca0c5-b712-4648-a9a8-34543b89d5db-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b2bca0c5-b712-4648-a9a8-34543b89d5db" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.230166 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2bca0c5-b712-4648-a9a8-34543b89d5db-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b2bca0c5-b712-4648-a9a8-34543b89d5db" (UID: "b2bca0c5-b712-4648-a9a8-34543b89d5db"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.306605 4911 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.306692 4911 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2bca0c5-b712-4648-a9a8-34543b89d5db-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.306706 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2hm7\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-kube-api-access-q2hm7\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.306718 4911 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.306731 4911 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2bca0c5-b712-4648-a9a8-34543b89d5db-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.306745 4911 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2bca0c5-b712-4648-a9a8-34543b89d5db-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.306757 4911 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2bca0c5-b712-4648-a9a8-34543b89d5db-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.828322 4911 generic.go:334] "Generic (PLEG): container finished" podID="b2bca0c5-b712-4648-a9a8-34543b89d5db" containerID="bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173" exitCode=0 Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.828645 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" event={"ID":"b2bca0c5-b712-4648-a9a8-34543b89d5db","Type":"ContainerDied","Data":"bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173"} Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.828696 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" event={"ID":"b2bca0c5-b712-4648-a9a8-34543b89d5db","Type":"ContainerDied","Data":"da54cf183ded94c80f50e62fcbde1af8279e3335f2f9568b845c7ba0406241f7"} Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.828712 4911 scope.go:117] "RemoveContainer" 
containerID="bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.828837 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g9hgb" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.857637 4911 scope.go:117] "RemoveContainer" containerID="bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173" Dec 01 00:12:33 crc kubenswrapper[4911]: E1201 00:12:33.858440 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173\": container with ID starting with bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173 not found: ID does not exist" containerID="bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.858554 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173"} err="failed to get container status \"bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173\": rpc error: code = NotFound desc = could not find container \"bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173\": container with ID starting with bbfa9bc76287e2614e8ef62a3aa27b7b5a817f5f6f073e2637e1189452e37173 not found: ID does not exist" Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.875562 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9hgb"] Dec 01 00:12:33 crc kubenswrapper[4911]: I1201 00:12:33.879968 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9hgb"] Dec 01 00:12:34 crc kubenswrapper[4911]: I1201 00:12:34.161986 4911 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b2bca0c5-b712-4648-a9a8-34543b89d5db" path="/var/lib/kubelet/pods/b2bca0c5-b712-4648-a9a8-34543b89d5db/volumes" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.107670 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds2z5"] Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.108744 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" podUID="dd16121c-421a-4466-8cf9-75c9c77e461a" containerName="controller-manager" containerID="cri-o://94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415" gracePeriod=30 Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.210792 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"] Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.211008 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" podUID="3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" containerName="route-controller-manager" containerID="cri-o://12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09" gracePeriod=30 Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.511367 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.538227 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7gzm\" (UniqueName: \"kubernetes.io/projected/dd16121c-421a-4466-8cf9-75c9c77e461a-kube-api-access-p7gzm\") pod \"dd16121c-421a-4466-8cf9-75c9c77e461a\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.538358 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-config\") pod \"dd16121c-421a-4466-8cf9-75c9c77e461a\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.538386 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-client-ca\") pod \"dd16121c-421a-4466-8cf9-75c9c77e461a\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.538421 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-proxy-ca-bundles\") pod \"dd16121c-421a-4466-8cf9-75c9c77e461a\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.538448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd16121c-421a-4466-8cf9-75c9c77e461a-serving-cert\") pod \"dd16121c-421a-4466-8cf9-75c9c77e461a\" (UID: \"dd16121c-421a-4466-8cf9-75c9c77e461a\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.541836 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-client-ca" (OuterVolumeSpecName: "client-ca") pod "dd16121c-421a-4466-8cf9-75c9c77e461a" (UID: "dd16121c-421a-4466-8cf9-75c9c77e461a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.541859 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dd16121c-421a-4466-8cf9-75c9c77e461a" (UID: "dd16121c-421a-4466-8cf9-75c9c77e461a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.541898 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-config" (OuterVolumeSpecName: "config") pod "dd16121c-421a-4466-8cf9-75c9c77e461a" (UID: "dd16121c-421a-4466-8cf9-75c9c77e461a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.546042 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd16121c-421a-4466-8cf9-75c9c77e461a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dd16121c-421a-4466-8cf9-75c9c77e461a" (UID: "dd16121c-421a-4466-8cf9-75c9c77e461a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.549834 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd16121c-421a-4466-8cf9-75c9c77e461a-kube-api-access-p7gzm" (OuterVolumeSpecName: "kube-api-access-p7gzm") pod "dd16121c-421a-4466-8cf9-75c9c77e461a" (UID: "dd16121c-421a-4466-8cf9-75c9c77e461a"). InnerVolumeSpecName "kube-api-access-p7gzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.564995 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.639800 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-config\") pod \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.639917 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgq6n\" (UniqueName: \"kubernetes.io/projected/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-kube-api-access-kgq6n\") pod \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.639955 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-serving-cert\") pod \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.639991 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-client-ca\") pod \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\" (UID: \"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e\") " Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.640266 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.640291 
4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.640303 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd16121c-421a-4466-8cf9-75c9c77e461a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.640316 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd16121c-421a-4466-8cf9-75c9c77e461a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.640331 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7gzm\" (UniqueName: \"kubernetes.io/projected/dd16121c-421a-4466-8cf9-75c9c77e461a-kube-api-access-p7gzm\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.641089 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-client-ca" (OuterVolumeSpecName: "client-ca") pod "3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" (UID: "3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.641137 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-config" (OuterVolumeSpecName: "config") pod "3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" (UID: "3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.644415 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-kube-api-access-kgq6n" (OuterVolumeSpecName: "kube-api-access-kgq6n") pod "3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" (UID: "3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e"). InnerVolumeSpecName "kube-api-access-kgq6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.644733 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" (UID: "3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.742343 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.742409 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgq6n\" (UniqueName: \"kubernetes.io/projected/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-kube-api-access-kgq6n\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.742427 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.742443 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e-client-ca\") on node \"crc\" DevicePath 
\"\"" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.892112 4911 generic.go:334] "Generic (PLEG): container finished" podID="dd16121c-421a-4466-8cf9-75c9c77e461a" containerID="94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415" exitCode=0 Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.892187 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.892209 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" event={"ID":"dd16121c-421a-4466-8cf9-75c9c77e461a","Type":"ContainerDied","Data":"94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415"} Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.892249 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ds2z5" event={"ID":"dd16121c-421a-4466-8cf9-75c9c77e461a","Type":"ContainerDied","Data":"b9bf4775a866bf33d06556c59526e6e8a7c636ddfcdc315028897ece28c4be55"} Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.892268 4911 scope.go:117] "RemoveContainer" containerID="94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.894670 4911 generic.go:334] "Generic (PLEG): container finished" podID="3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" containerID="12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09" exitCode=0 Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.894732 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" event={"ID":"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e","Type":"ContainerDied","Data":"12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09"} Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.894749 
4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" event={"ID":"3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e","Type":"ContainerDied","Data":"2a1b35915b28c5d7230d93ad5465e2e17d2090b546893f6e4e3fb394934d04ac"} Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.894808 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.910931 4911 scope.go:117] "RemoveContainer" containerID="94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415" Dec 01 00:12:42 crc kubenswrapper[4911]: E1201 00:12:42.911622 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415\": container with ID starting with 94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415 not found: ID does not exist" containerID="94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.911664 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415"} err="failed to get container status \"94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415\": rpc error: code = NotFound desc = could not find container \"94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415\": container with ID starting with 94ef88428911af0aa2c941f2a82e51e22b64e73a988b651124858d84b6497415 not found: ID does not exist" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.911688 4911 scope.go:117] "RemoveContainer" containerID="12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.928120 4911 
scope.go:117] "RemoveContainer" containerID="12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09" Dec 01 00:12:42 crc kubenswrapper[4911]: E1201 00:12:42.932640 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09\": container with ID starting with 12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09 not found: ID does not exist" containerID="12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.932699 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09"} err="failed to get container status \"12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09\": rpc error: code = NotFound desc = could not find container \"12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09\": container with ID starting with 12afd1820ade6e7ca41322deb45ca3d6150ac219c40dabaf8682580c226b6c09 not found: ID does not exist" Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.937759 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds2z5"] Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.944557 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ds2z5"] Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.949008 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"] Dec 01 00:12:42 crc kubenswrapper[4911]: I1201 00:12:42.952334 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6nqdq"] Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 
00:12:43.269096 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5"] Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269781 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="extract-utilities" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269801 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="extract-utilities" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269817 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b27e6c-5803-46ae-ac80-00f249cb714c" containerName="marketplace-operator" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269826 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b27e6c-5803-46ae-ac80-00f249cb714c" containerName="marketplace-operator" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269842 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerName="extract-utilities" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269849 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerName="extract-utilities" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269860 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerName="extract-content" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269867 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerName="extract-content" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269879 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerName="registry-server" Dec 01 00:12:43 crc 
kubenswrapper[4911]: I1201 00:12:43.269887 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269895 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269903 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269914 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerName="extract-utilities" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269920 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerName="extract-utilities" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269929 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="extract-utilities" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269936 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="extract-utilities" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269947 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="extract-content" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269953 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="extract-content" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269962 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bca0c5-b712-4648-a9a8-34543b89d5db" containerName="registry" Dec 01 00:12:43 crc 
kubenswrapper[4911]: I1201 00:12:43.269969 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bca0c5-b712-4648-a9a8-34543b89d5db" containerName="registry" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269979 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.269986 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.269994 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerName="extract-content" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270002 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerName="extract-content" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.270010 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" containerName="route-controller-manager" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270017 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" containerName="route-controller-manager" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.270027 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270034 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.270044 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd16121c-421a-4466-8cf9-75c9c77e461a" containerName="controller-manager" Dec 01 00:12:43 crc 
kubenswrapper[4911]: I1201 00:12:43.270050 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd16121c-421a-4466-8cf9-75c9c77e461a" containerName="controller-manager" Dec 01 00:12:43 crc kubenswrapper[4911]: E1201 00:12:43.270058 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="extract-content" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270064 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="extract-content" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270167 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2bca0c5-b712-4648-a9a8-34543b89d5db" containerName="registry" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270179 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" containerName="route-controller-manager" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270190 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd16121c-421a-4466-8cf9-75c9c77e461a" containerName="controller-manager" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270199 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a33691a-6c8a-47ac-9d8a-cce2a68425e7" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270211 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a04cb5f-c090-44dc-9a3e-b3e3b5ae9209" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270218 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55ff1f6-20d7-435a-9764-59a0b24f7000" containerName="registry-server" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270226 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed713643-05a3-45af-a821-b053054528dd" containerName="registry-server" 
Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270233 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b27e6c-5803-46ae-ac80-00f249cb714c" containerName="marketplace-operator" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.270840 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.276933 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.277411 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.278301 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.278501 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.278444 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.278997 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.283900 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj"] Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.285197 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.287194 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.287325 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.287530 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.287805 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.287895 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.288011 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.295337 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj"] Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.297101 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.310226 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5"] Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.465810 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2ld6\" (UniqueName: 
\"kubernetes.io/projected/256904ce-0e29-46bf-8c4c-73aba1158778-kube-api-access-x2ld6\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.465866 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-serving-cert\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.465900 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m5bt\" (UniqueName: \"kubernetes.io/projected/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-kube-api-access-2m5bt\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.465931 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-client-ca\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.466048 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-config\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " 
pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.466078 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/256904ce-0e29-46bf-8c4c-73aba1158778-serving-cert\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.466102 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-proxy-ca-bundles\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.466166 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-client-ca\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.466194 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-config\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.567962 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-config\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.568042 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/256904ce-0e29-46bf-8c4c-73aba1158778-serving-cert\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.568093 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-proxy-ca-bundles\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.568127 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-client-ca\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.568160 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-config\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 
00:12:43.568217 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2ld6\" (UniqueName: \"kubernetes.io/projected/256904ce-0e29-46bf-8c4c-73aba1158778-kube-api-access-x2ld6\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.568247 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-serving-cert\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.568273 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m5bt\" (UniqueName: \"kubernetes.io/projected/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-kube-api-access-2m5bt\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.568308 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-client-ca\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.569747 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-config\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: 
\"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.570245 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-client-ca\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.570528 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-proxy-ca-bundles\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.570771 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-config\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.574182 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-client-ca\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.584712 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-serving-cert\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.584770 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/256904ce-0e29-46bf-8c4c-73aba1158778-serving-cert\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.588890 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2ld6\" (UniqueName: \"kubernetes.io/projected/256904ce-0e29-46bf-8c4c-73aba1158778-kube-api-access-x2ld6\") pod \"controller-manager-6b64cbf5d9-27kzj\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.591092 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m5bt\" (UniqueName: \"kubernetes.io/projected/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-kube-api-access-2m5bt\") pod \"route-controller-manager-75c8d44cbc-st2t5\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.610015 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.625232 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.876922 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj"] Dec 01 00:12:43 crc kubenswrapper[4911]: W1201 00:12:43.886056 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod256904ce_0e29_46bf_8c4c_73aba1158778.slice/crio-aa1952f540f38721acff7a7b029dccd1f046f965380655e3714d523026777fe2 WatchSource:0}: Error finding container aa1952f540f38721acff7a7b029dccd1f046f965380655e3714d523026777fe2: Status 404 returned error can't find the container with id aa1952f540f38721acff7a7b029dccd1f046f965380655e3714d523026777fe2 Dec 01 00:12:43 crc kubenswrapper[4911]: I1201 00:12:43.903413 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" event={"ID":"256904ce-0e29-46bf-8c4c-73aba1158778","Type":"ContainerStarted","Data":"aa1952f540f38721acff7a7b029dccd1f046f965380655e3714d523026777fe2"} Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.023854 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5"] Dec 01 00:12:44 crc kubenswrapper[4911]: W1201 00:12:44.032041 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8d16b4e_0492_4e40_a643_79ee7ba5aa24.slice/crio-b6ccee2e7ef9b323ed18ee4e610a7d104b8bef2ef88f327e7c2f820da7798069 WatchSource:0}: Error finding container b6ccee2e7ef9b323ed18ee4e610a7d104b8bef2ef88f327e7c2f820da7798069: Status 404 returned error can't find the container with id b6ccee2e7ef9b323ed18ee4e610a7d104b8bef2ef88f327e7c2f820da7798069 Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.161164 4911 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e" path="/var/lib/kubelet/pods/3d5eb7fe-2cf6-4857-8dcc-f3ee74be096e/volumes" Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.162542 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd16121c-421a-4466-8cf9-75c9c77e461a" path="/var/lib/kubelet/pods/dd16121c-421a-4466-8cf9-75c9c77e461a/volumes" Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.911936 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" event={"ID":"256904ce-0e29-46bf-8c4c-73aba1158778","Type":"ContainerStarted","Data":"3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc"} Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.912447 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.913424 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" event={"ID":"a8d16b4e-0492-4e40-a643-79ee7ba5aa24","Type":"ContainerStarted","Data":"377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c"} Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.913477 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" event={"ID":"a8d16b4e-0492-4e40-a643-79ee7ba5aa24","Type":"ContainerStarted","Data":"b6ccee2e7ef9b323ed18ee4e610a7d104b8bef2ef88f327e7c2f820da7798069"} Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.913641 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.916786 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.920711 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.936153 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" podStartSLOduration=2.93612412 podStartE2EDuration="2.93612412s" podCreationTimestamp="2025-12-01 00:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:12:44.934922887 +0000 UTC m=+325.073619678" watchObservedRunningTime="2025-12-01 00:12:44.93612412 +0000 UTC m=+325.074820901" Dec 01 00:12:44 crc kubenswrapper[4911]: I1201 00:12:44.987995 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" podStartSLOduration=2.987965413 podStartE2EDuration="2.987965413s" podCreationTimestamp="2025-12-01 00:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:12:44.982755062 +0000 UTC m=+325.121451823" watchObservedRunningTime="2025-12-01 00:12:44.987965413 +0000 UTC m=+325.126662184" Dec 01 00:12:45 crc kubenswrapper[4911]: I1201 00:12:45.464806 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj"] Dec 01 00:12:45 crc kubenswrapper[4911]: I1201 00:12:45.475790 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5"] Dec 01 00:12:46 crc kubenswrapper[4911]: I1201 00:12:46.927673 4911 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" podUID="a8d16b4e-0492-4e40-a643-79ee7ba5aa24" containerName="route-controller-manager" containerID="cri-o://377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c" gracePeriod=30 Dec 01 00:12:46 crc kubenswrapper[4911]: I1201 00:12:46.927810 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" podUID="256904ce-0e29-46bf-8c4c-73aba1158778" containerName="controller-manager" containerID="cri-o://3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc" gracePeriod=30 Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.448722 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.457942 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.498923 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fb864b4d-lvd6t"] Dec 01 00:12:47 crc kubenswrapper[4911]: E1201 00:12:47.499269 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d16b4e-0492-4e40-a643-79ee7ba5aa24" containerName="route-controller-manager" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.499293 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d16b4e-0492-4e40-a643-79ee7ba5aa24" containerName="route-controller-manager" Dec 01 00:12:47 crc kubenswrapper[4911]: E1201 00:12:47.499318 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="256904ce-0e29-46bf-8c4c-73aba1158778" containerName="controller-manager" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.499329 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="256904ce-0e29-46bf-8c4c-73aba1158778" containerName="controller-manager" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.499518 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d16b4e-0492-4e40-a643-79ee7ba5aa24" containerName="route-controller-manager" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.499540 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="256904ce-0e29-46bf-8c4c-73aba1158778" containerName="controller-manager" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.500059 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.511608 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fb864b4d-lvd6t"] Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.632648 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-serving-cert\") pod \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633233 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m5bt\" (UniqueName: \"kubernetes.io/projected/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-kube-api-access-2m5bt\") pod \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633291 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-config\") pod \"256904ce-0e29-46bf-8c4c-73aba1158778\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633347 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/256904ce-0e29-46bf-8c4c-73aba1158778-serving-cert\") pod \"256904ce-0e29-46bf-8c4c-73aba1158778\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633413 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-client-ca\") pod \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\" (UID: 
\"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2ld6\" (UniqueName: \"kubernetes.io/projected/256904ce-0e29-46bf-8c4c-73aba1158778-kube-api-access-x2ld6\") pod \"256904ce-0e29-46bf-8c4c-73aba1158778\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633510 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-client-ca\") pod \"256904ce-0e29-46bf-8c4c-73aba1158778\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633575 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-proxy-ca-bundles\") pod \"256904ce-0e29-46bf-8c4c-73aba1158778\" (UID: \"256904ce-0e29-46bf-8c4c-73aba1158778\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633609 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-config\") pod \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\" (UID: \"a8d16b4e-0492-4e40-a643-79ee7ba5aa24\") " Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633736 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/262f9eea-7a20-478e-a1a3-5a70fa377aef-serving-cert\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633781 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkz8d\" (UniqueName: \"kubernetes.io/projected/262f9eea-7a20-478e-a1a3-5a70fa377aef-kube-api-access-vkz8d\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633836 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-config\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633913 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-proxy-ca-bundles\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.633978 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-client-ca\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.634721 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8d16b4e-0492-4e40-a643-79ee7ba5aa24" (UID: "a8d16b4e-0492-4e40-a643-79ee7ba5aa24"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.634845 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-config" (OuterVolumeSpecName: "config") pod "256904ce-0e29-46bf-8c4c-73aba1158778" (UID: "256904ce-0e29-46bf-8c4c-73aba1158778"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.634917 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-client-ca" (OuterVolumeSpecName: "client-ca") pod "256904ce-0e29-46bf-8c4c-73aba1158778" (UID: "256904ce-0e29-46bf-8c4c-73aba1158778"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.634985 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "256904ce-0e29-46bf-8c4c-73aba1158778" (UID: "256904ce-0e29-46bf-8c4c-73aba1158778"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.635207 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-config" (OuterVolumeSpecName: "config") pod "a8d16b4e-0492-4e40-a643-79ee7ba5aa24" (UID: "a8d16b4e-0492-4e40-a643-79ee7ba5aa24"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.641222 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8d16b4e-0492-4e40-a643-79ee7ba5aa24" (UID: "a8d16b4e-0492-4e40-a643-79ee7ba5aa24"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.642278 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256904ce-0e29-46bf-8c4c-73aba1158778-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "256904ce-0e29-46bf-8c4c-73aba1158778" (UID: "256904ce-0e29-46bf-8c4c-73aba1158778"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.642841 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/256904ce-0e29-46bf-8c4c-73aba1158778-kube-api-access-x2ld6" (OuterVolumeSpecName: "kube-api-access-x2ld6") pod "256904ce-0e29-46bf-8c4c-73aba1158778" (UID: "256904ce-0e29-46bf-8c4c-73aba1158778"). InnerVolumeSpecName "kube-api-access-x2ld6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.647360 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-kube-api-access-2m5bt" (OuterVolumeSpecName: "kube-api-access-2m5bt") pod "a8d16b4e-0492-4e40-a643-79ee7ba5aa24" (UID: "a8d16b4e-0492-4e40-a643-79ee7ba5aa24"). InnerVolumeSpecName "kube-api-access-2m5bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.734989 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkz8d\" (UniqueName: \"kubernetes.io/projected/262f9eea-7a20-478e-a1a3-5a70fa377aef-kube-api-access-vkz8d\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735059 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-config\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735113 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-proxy-ca-bundles\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735148 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-client-ca\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735189 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/262f9eea-7a20-478e-a1a3-5a70fa377aef-serving-cert\") pod 
\"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735243 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735264 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/256904ce-0e29-46bf-8c4c-73aba1158778-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735490 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735749 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2ld6\" (UniqueName: \"kubernetes.io/projected/256904ce-0e29-46bf-8c4c-73aba1158778-kube-api-access-x2ld6\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735865 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735899 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/256904ce-0e29-46bf-8c4c-73aba1158778-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735928 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-config\") on node \"crc\" DevicePath \"\"" Dec 01 
00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735961 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.735990 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m5bt\" (UniqueName: \"kubernetes.io/projected/a8d16b4e-0492-4e40-a643-79ee7ba5aa24-kube-api-access-2m5bt\") on node \"crc\" DevicePath \"\"" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.738491 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-client-ca\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.739365 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-proxy-ca-bundles\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.740705 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-config\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.743718 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/262f9eea-7a20-478e-a1a3-5a70fa377aef-serving-cert\") pod 
\"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.756692 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkz8d\" (UniqueName: \"kubernetes.io/projected/262f9eea-7a20-478e-a1a3-5a70fa377aef-kube-api-access-vkz8d\") pod \"controller-manager-fb864b4d-lvd6t\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.836291 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.942139 4911 generic.go:334] "Generic (PLEG): container finished" podID="256904ce-0e29-46bf-8c4c-73aba1158778" containerID="3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc" exitCode=0 Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.942313 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" event={"ID":"256904ce-0e29-46bf-8c4c-73aba1158778","Type":"ContainerDied","Data":"3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc"} Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.942337 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.942374 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj" event={"ID":"256904ce-0e29-46bf-8c4c-73aba1158778","Type":"ContainerDied","Data":"aa1952f540f38721acff7a7b029dccd1f046f965380655e3714d523026777fe2"} Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.942420 4911 scope.go:117] "RemoveContainer" containerID="3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.952401 4911 generic.go:334] "Generic (PLEG): container finished" podID="a8d16b4e-0492-4e40-a643-79ee7ba5aa24" containerID="377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c" exitCode=0 Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.952494 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" event={"ID":"a8d16b4e-0492-4e40-a643-79ee7ba5aa24","Type":"ContainerDied","Data":"377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c"} Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.952509 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.952542 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5" event={"ID":"a8d16b4e-0492-4e40-a643-79ee7ba5aa24","Type":"ContainerDied","Data":"b6ccee2e7ef9b323ed18ee4e610a7d104b8bef2ef88f327e7c2f820da7798069"} Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.983641 4911 scope.go:117] "RemoveContainer" containerID="3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc" Dec 01 00:12:47 crc kubenswrapper[4911]: E1201 00:12:47.984435 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc\": container with ID starting with 3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc not found: ID does not exist" containerID="3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.984503 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc"} err="failed to get container status \"3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc\": rpc error: code = NotFound desc = could not find container \"3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc\": container with ID starting with 3cb7fb2bc41d2cd1f8f284d53bf0dea28cd3c7f41cd719ce59af120ff00c13fc not found: ID does not exist" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.984528 4911 scope.go:117] "RemoveContainer" containerID="377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c" Dec 01 00:12:47 crc kubenswrapper[4911]: I1201 00:12:47.994810 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj"]
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.000955 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b64cbf5d9-27kzj"]
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.011144 4911 scope.go:117] "RemoveContainer" containerID="377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c"
Dec 01 00:12:48 crc kubenswrapper[4911]: E1201 00:12:48.015214 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c\": container with ID starting with 377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c not found: ID does not exist" containerID="377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c"
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.015290 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c"} err="failed to get container status \"377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c\": rpc error: code = NotFound desc = could not find container \"377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c\": container with ID starting with 377dfbd6492e280bef99f75c17de5fe691f9039a64f20bc70f4a4a577ff4ee0c not found: ID does not exist"
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.026128 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5"]
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.034870 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c8d44cbc-st2t5"]
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.116924 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fb864b4d-lvd6t"]
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.158126 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="256904ce-0e29-46bf-8c4c-73aba1158778" path="/var/lib/kubelet/pods/256904ce-0e29-46bf-8c4c-73aba1158778/volumes"
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.158982 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d16b4e-0492-4e40-a643-79ee7ba5aa24" path="/var/lib/kubelet/pods/a8d16b4e-0492-4e40-a643-79ee7ba5aa24/volumes"
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.972315 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" event={"ID":"262f9eea-7a20-478e-a1a3-5a70fa377aef","Type":"ContainerStarted","Data":"5e64fbad11b2fef108e385f3d4e1e9e9290ea799f58cffc5b3726a335854b3db"}
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.972987 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" event={"ID":"262f9eea-7a20-478e-a1a3-5a70fa377aef","Type":"ContainerStarted","Data":"06b234c5a332d57846dc6abfad2011f224a083e14cfc4f9049016e955afe5dec"}
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.973865 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t"
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.978785 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t"
Dec 01 00:12:48 crc kubenswrapper[4911]: I1201 00:12:48.993478 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" podStartSLOduration=3.99344411 podStartE2EDuration="3.99344411s" podCreationTimestamp="2025-12-01 00:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:12:48.988981569 +0000 UTC m=+329.127678340" watchObservedRunningTime="2025-12-01 00:12:48.99344411 +0000 UTC m=+329.132140881"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.274605 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"]
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.276948 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.282945 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.283092 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.283271 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.282949 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.283430 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.283771 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.298749 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"]
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.369834 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-client-ca\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.369900 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-config\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.369949 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/700ef04b-517a-40c1-964a-f935e19785c0-serving-cert\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.370080 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfhff\" (UniqueName: \"kubernetes.io/projected/700ef04b-517a-40c1-964a-f935e19785c0-kube-api-access-qfhff\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.472333 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-client-ca\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.473969 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-client-ca\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.474072 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-config\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.474140 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/700ef04b-517a-40c1-964a-f935e19785c0-serving-cert\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.474283 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfhff\" (UniqueName: \"kubernetes.io/projected/700ef04b-517a-40c1-964a-f935e19785c0-kube-api-access-qfhff\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.475939 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-config\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.487297 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/700ef04b-517a-40c1-964a-f935e19785c0-serving-cert\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.503953 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfhff\" (UniqueName: \"kubernetes.io/projected/700ef04b-517a-40c1-964a-f935e19785c0-kube-api-access-qfhff\") pod \"route-controller-manager-5dcdbd9666-hqcm2\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:50 crc kubenswrapper[4911]: I1201 00:12:50.606194 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:51 crc kubenswrapper[4911]: I1201 00:12:51.073659 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"]
Dec 01 00:12:51 crc kubenswrapper[4911]: I1201 00:12:51.992481 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2" event={"ID":"700ef04b-517a-40c1-964a-f935e19785c0","Type":"ContainerStarted","Data":"68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b"}
Dec 01 00:12:51 crc kubenswrapper[4911]: I1201 00:12:51.992914 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2" event={"ID":"700ef04b-517a-40c1-964a-f935e19785c0","Type":"ContainerStarted","Data":"84e1642657520ee25fe13ca86198198162b0bca06bd8cc62d8689c7f5c30c806"}
Dec 01 00:12:51 crc kubenswrapper[4911]: I1201 00:12:51.992940 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:52 crc kubenswrapper[4911]: I1201 00:12:52.001648 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"
Dec 01 00:12:52 crc kubenswrapper[4911]: I1201 00:12:52.016917 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2" podStartSLOduration=7.016887281 podStartE2EDuration="7.016887281s" podCreationTimestamp="2025-12-01 00:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:12:52.014699291 +0000 UTC m=+332.153396082" watchObservedRunningTime="2025-12-01 00:12:52.016887281 +0000 UTC m=+332.155584052"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.735599 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f4zfd"]
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.737355 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.740754 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.754043 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4zfd"]
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.837601 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj8p6\" (UniqueName: \"kubernetes.io/projected/d8e7bcea-3402-4357-b5ff-3dd1067f9500-kube-api-access-dj8p6\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.838182 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-utilities\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.838283 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-catalog-content\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.933189 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8gsmv"]
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.935730 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.938853 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8gsmv"]
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.939419 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-utilities\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.939518 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-catalog-content\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.939642 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj8p6\" (UniqueName: \"kubernetes.io/projected/d8e7bcea-3402-4357-b5ff-3dd1067f9500-kube-api-access-dj8p6\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.940695 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-utilities\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.941002 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-catalog-content\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.941043 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 01 00:12:54 crc kubenswrapper[4911]: I1201 00:12:54.966375 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj8p6\" (UniqueName: \"kubernetes.io/projected/d8e7bcea-3402-4357-b5ff-3dd1067f9500-kube-api-access-dj8p6\") pod \"redhat-marketplace-f4zfd\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.040714 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrl8\" (UniqueName: \"kubernetes.io/projected/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-kube-api-access-mvrl8\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.040765 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-catalog-content\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.041524 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-utilities\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.079196 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4zfd"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.142563 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-catalog-content\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.142710 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-utilities\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.142790 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrl8\" (UniqueName: \"kubernetes.io/projected/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-kube-api-access-mvrl8\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.143013 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-catalog-content\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.143430 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-utilities\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.163053 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrl8\" (UniqueName: \"kubernetes.io/projected/6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a-kube-api-access-mvrl8\") pod \"redhat-operators-8gsmv\" (UID: \"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a\") " pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.309554 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gsmv"
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.485641 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4zfd"]
Dec 01 00:12:55 crc kubenswrapper[4911]: W1201 00:12:55.501355 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e7bcea_3402_4357_b5ff_3dd1067f9500.slice/crio-b82acdcd1575ae3d1a8811f0d70eb48b602de6af641714fd27339a76dffbc35a WatchSource:0}: Error finding container b82acdcd1575ae3d1a8811f0d70eb48b602de6af641714fd27339a76dffbc35a: Status 404 returned error can't find the container with id b82acdcd1575ae3d1a8811f0d70eb48b602de6af641714fd27339a76dffbc35a
Dec 01 00:12:55 crc kubenswrapper[4911]: I1201 00:12:55.711693 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8gsmv"]
Dec 01 00:12:55 crc kubenswrapper[4911]: W1201 00:12:55.761184 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac2dde0_9bc9_4dec_ae5f_dcb44c02b24a.slice/crio-7a8f236b85a9b1e9fea9a6eca4c5901ea77b5aa9379c28ac753d4cf1183ab276 WatchSource:0}: Error finding container 7a8f236b85a9b1e9fea9a6eca4c5901ea77b5aa9379c28ac753d4cf1183ab276: Status 404 returned error can't find the container with id 7a8f236b85a9b1e9fea9a6eca4c5901ea77b5aa9379c28ac753d4cf1183ab276
Dec 01 00:12:56 crc kubenswrapper[4911]: I1201 00:12:56.019169 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerID="adb7b1928e7a5759cf2d7ad36673b995f38d7272e4546dc7f191cd1145c08ba6" exitCode=0
Dec 01 00:12:56 crc kubenswrapper[4911]: I1201 00:12:56.019235 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4zfd" event={"ID":"d8e7bcea-3402-4357-b5ff-3dd1067f9500","Type":"ContainerDied","Data":"adb7b1928e7a5759cf2d7ad36673b995f38d7272e4546dc7f191cd1145c08ba6"}
Dec 01 00:12:56 crc kubenswrapper[4911]: I1201 00:12:56.019672 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4zfd" event={"ID":"d8e7bcea-3402-4357-b5ff-3dd1067f9500","Type":"ContainerStarted","Data":"b82acdcd1575ae3d1a8811f0d70eb48b602de6af641714fd27339a76dffbc35a"}
Dec 01 00:12:56 crc kubenswrapper[4911]: I1201 00:12:56.022352 4911 generic.go:334] "Generic (PLEG): container finished" podID="6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a" containerID="3239fed09d5d24ff432d61a647a722df7732b1313f85dcc86c3c4136ff341ead" exitCode=0
Dec 01 00:12:56 crc kubenswrapper[4911]: I1201 00:12:56.022483 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gsmv" event={"ID":"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a","Type":"ContainerDied","Data":"3239fed09d5d24ff432d61a647a722df7732b1313f85dcc86c3c4136ff341ead"}
Dec 01 00:12:56 crc kubenswrapper[4911]: I1201 00:12:56.022566 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gsmv" event={"ID":"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a","Type":"ContainerStarted","Data":"7a8f236b85a9b1e9fea9a6eca4c5901ea77b5aa9379c28ac753d4cf1183ab276"}
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.126098 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gzf88"]
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.128521 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.131038 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.137176 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gzf88"]
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.171217 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxll\" (UniqueName: \"kubernetes.io/projected/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-kube-api-access-lrxll\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.171307 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-utilities\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.171339 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-catalog-content\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.272209 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrxll\" (UniqueName: \"kubernetes.io/projected/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-kube-api-access-lrxll\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.272258 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-utilities\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.272280 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-catalog-content\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.272736 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-catalog-content\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.272816 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-utilities\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.297106 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxll\" (UniqueName: \"kubernetes.io/projected/7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d-kube-api-access-lrxll\") pod \"community-operators-gzf88\" (UID: \"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d\") " pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.319974 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wxg8r"]
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.320914 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.323200 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.335988 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxg8r"]
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.373110 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-catalog-content\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.373219 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmngr\" (UniqueName: \"kubernetes.io/projected/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-kube-api-access-zmngr\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.373240 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-utilities\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.463689 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gzf88"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.474623 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmngr\" (UniqueName: \"kubernetes.io/projected/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-kube-api-access-zmngr\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.474692 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-utilities\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.474752 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-catalog-content\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.475534 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-catalog-content\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.476654 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-utilities\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.507501 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmngr\" (UniqueName: \"kubernetes.io/projected/f2e3a5b9-bf28-4144-8cc7-9c66843eccb5-kube-api-access-zmngr\") pod \"certified-operators-wxg8r\" (UID: \"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5\") " pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.641005 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxg8r"
Dec 01 00:12:57 crc kubenswrapper[4911]: I1201 00:12:57.941977 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gzf88"]
Dec 01 00:12:57 crc kubenswrapper[4911]: W1201 00:12:57.949805 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf4caa9_984a_4b1f_8fe8_23c79dd9df8d.slice/crio-8832e6e56fb42073970f76306f30ef964d816521d0f76c2e85c214daad51a48a WatchSource:0}: Error finding container 8832e6e56fb42073970f76306f30ef964d816521d0f76c2e85c214daad51a48a: Status 404 returned error can't find the container with id 8832e6e56fb42073970f76306f30ef964d816521d0f76c2e85c214daad51a48a
Dec 01 00:12:58 crc kubenswrapper[4911]: I1201 00:12:58.041005 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzf88" event={"ID":"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d","Type":"ContainerStarted","Data":"8832e6e56fb42073970f76306f30ef964d816521d0f76c2e85c214daad51a48a"}
Dec 01 00:12:58 crc kubenswrapper[4911]: I1201 00:12:58.071094 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxg8r"]
Dec 01 00:12:58 crc kubenswrapper[4911]: W1201 00:12:58.074789 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e3a5b9_bf28_4144_8cc7_9c66843eccb5.slice/crio-cd5339657d704cecfc4945ddcbf65e56970e8a453912838b00877ea21d08a701 WatchSource:0}: Error finding container cd5339657d704cecfc4945ddcbf65e56970e8a453912838b00877ea21d08a701: Status 404 returned error can't find the container with id cd5339657d704cecfc4945ddcbf65e56970e8a453912838b00877ea21d08a701
Dec 01 00:12:59 crc kubenswrapper[4911]: I1201 00:12:59.058223 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerID="445af9d995497e514f9523b3c3d39171c6a4de0620f2ea57ee058a4cebb6b618" exitCode=0
Dec 01 00:12:59 crc kubenswrapper[4911]: I1201 00:12:59.058298 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4zfd" event={"ID":"d8e7bcea-3402-4357-b5ff-3dd1067f9500","Type":"ContainerDied","Data":"445af9d995497e514f9523b3c3d39171c6a4de0620f2ea57ee058a4cebb6b618"}
Dec 01 00:12:59 crc kubenswrapper[4911]: I1201 00:12:59.064553 4911 generic.go:334] "Generic (PLEG): container finished" podID="7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d" containerID="f5df87a17da74ec6f51d1ff8241925ac538cff6dc0d5a71c0c23f5565af33289" exitCode=0
Dec 01 00:12:59 crc kubenswrapper[4911]: I1201 00:12:59.065308 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzf88" event={"ID":"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d","Type":"ContainerDied","Data":"f5df87a17da74ec6f51d1ff8241925ac538cff6dc0d5a71c0c23f5565af33289"}
Dec 01 00:12:59 crc kubenswrapper[4911]: I1201 00:12:59.083794 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gsmv" event={"ID":"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a","Type":"ContainerStarted","Data":"f0cc473f3df0d1f7956291511cdaf75514b3d24a72251ec2caf13ccc66e2f09d"}
Dec 01 00:12:59 crc kubenswrapper[4911]: I1201 00:12:59.085501 4911 generic.go:334] "Generic (PLEG): container finished" podID="f2e3a5b9-bf28-4144-8cc7-9c66843eccb5" containerID="9a79a34eba35820c0101a6b982349e6d6b4ddfaed4a689e6303e4d7c4e78fab8" exitCode=0
Dec 01 00:12:59 crc kubenswrapper[4911]: I1201 00:12:59.085577 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxg8r" event={"ID":"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5","Type":"ContainerDied","Data":"9a79a34eba35820c0101a6b982349e6d6b4ddfaed4a689e6303e4d7c4e78fab8"}
Dec 01 00:12:59 crc kubenswrapper[4911]: I1201 00:12:59.085622 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxg8r" event={"ID":"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5","Type":"ContainerStarted","Data":"cd5339657d704cecfc4945ddcbf65e56970e8a453912838b00877ea21d08a701"}
Dec 01 00:13:00 crc kubenswrapper[4911]: I1201 00:13:00.096958 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4zfd" event={"ID":"d8e7bcea-3402-4357-b5ff-3dd1067f9500","Type":"ContainerStarted","Data":"9c3486da02a91cc12b6f2ab6e530b930f3ba24465751ef8e2677709e77e3b087"}
Dec 01 00:13:00 crc kubenswrapper[4911]: I1201 00:13:00.100819 4911 generic.go:334] "Generic (PLEG): container finished" podID="6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a" containerID="f0cc473f3df0d1f7956291511cdaf75514b3d24a72251ec2caf13ccc66e2f09d" exitCode=0
Dec 01 00:13:00 crc kubenswrapper[4911]: I1201 00:13:00.100872 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gsmv" event={"ID":"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a","Type":"ContainerDied","Data":"f0cc473f3df0d1f7956291511cdaf75514b3d24a72251ec2caf13ccc66e2f09d"}
Dec 01 00:13:00 crc kubenswrapper[4911]: I1201
00:13:00.149058 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f4zfd" podStartSLOduration=2.656525211 podStartE2EDuration="6.14904266s" podCreationTimestamp="2025-12-01 00:12:54 +0000 UTC" firstStartedPulling="2025-12-01 00:12:56.02143989 +0000 UTC m=+336.160136701" lastFinishedPulling="2025-12-01 00:12:59.513957379 +0000 UTC m=+339.652654150" observedRunningTime="2025-12-01 00:13:00.128423007 +0000 UTC m=+340.267119788" watchObservedRunningTime="2025-12-01 00:13:00.14904266 +0000 UTC m=+340.287739431" Dec 01 00:13:01 crc kubenswrapper[4911]: I1201 00:13:01.116166 4911 generic.go:334] "Generic (PLEG): container finished" podID="f2e3a5b9-bf28-4144-8cc7-9c66843eccb5" containerID="869ae0acab75f8520f24b495e7112ad3db513b8632d71a39b1ebd03e90c4f431" exitCode=0 Dec 01 00:13:01 crc kubenswrapper[4911]: I1201 00:13:01.116334 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxg8r" event={"ID":"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5","Type":"ContainerDied","Data":"869ae0acab75f8520f24b495e7112ad3db513b8632d71a39b1ebd03e90c4f431"} Dec 01 00:13:01 crc kubenswrapper[4911]: I1201 00:13:01.121338 4911 generic.go:334] "Generic (PLEG): container finished" podID="7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d" containerID="3da8cae8341dde79e50af5c25b0d8ca236baf33179bd4e4cfdde2d066000f779" exitCode=0 Dec 01 00:13:01 crc kubenswrapper[4911]: I1201 00:13:01.121426 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzf88" event={"ID":"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d","Type":"ContainerDied","Data":"3da8cae8341dde79e50af5c25b0d8ca236baf33179bd4e4cfdde2d066000f779"} Dec 01 00:13:01 crc kubenswrapper[4911]: I1201 00:13:01.126650 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gsmv" 
event={"ID":"6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a","Type":"ContainerStarted","Data":"3d3bcf9155a1deac8a1e154e1f5b03f24f80e58fe2e5e4bce2dec0b4dfbc7532"} Dec 01 00:13:01 crc kubenswrapper[4911]: I1201 00:13:01.196362 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8gsmv" podStartSLOduration=2.538980252 podStartE2EDuration="7.196340908s" podCreationTimestamp="2025-12-01 00:12:54 +0000 UTC" firstStartedPulling="2025-12-01 00:12:56.025032018 +0000 UTC m=+336.163728799" lastFinishedPulling="2025-12-01 00:13:00.682392684 +0000 UTC m=+340.821089455" observedRunningTime="2025-12-01 00:13:01.168043476 +0000 UTC m=+341.306740257" watchObservedRunningTime="2025-12-01 00:13:01.196340908 +0000 UTC m=+341.335037689" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.134673 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzf88" event={"ID":"7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d","Type":"ContainerStarted","Data":"3e661f8fb849bece2bd4555636da657ce9d9260e74db38f900c521a807819a5b"} Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.137190 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxg8r" event={"ID":"f2e3a5b9-bf28-4144-8cc7-9c66843eccb5","Type":"ContainerStarted","Data":"09f67f771a4b43bda71c37e7af17f89e591d3d20f04d74f9ad035de2a876fc33"} Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.147399 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"] Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.147804 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2" podUID="700ef04b-517a-40c1-964a-f935e19785c0" containerName="route-controller-manager" 
containerID="cri-o://68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b" gracePeriod=30 Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.155580 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gzf88" podStartSLOduration=2.5257384 podStartE2EDuration="5.155553301s" podCreationTimestamp="2025-12-01 00:12:57 +0000 UTC" firstStartedPulling="2025-12-01 00:12:59.070817958 +0000 UTC m=+339.209514739" lastFinishedPulling="2025-12-01 00:13:01.700632859 +0000 UTC m=+341.839329640" observedRunningTime="2025-12-01 00:13:02.152270902 +0000 UTC m=+342.290967683" watchObservedRunningTime="2025-12-01 00:13:02.155553301 +0000 UTC m=+342.294250102" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.184518 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wxg8r" podStartSLOduration=2.535843007 podStartE2EDuration="5.184490401s" podCreationTimestamp="2025-12-01 00:12:57 +0000 UTC" firstStartedPulling="2025-12-01 00:12:59.087859614 +0000 UTC m=+339.226556405" lastFinishedPulling="2025-12-01 00:13:01.736507018 +0000 UTC m=+341.875203799" observedRunningTime="2025-12-01 00:13:02.18113565 +0000 UTC m=+342.319832491" watchObservedRunningTime="2025-12-01 00:13:02.184490401 +0000 UTC m=+342.323187202" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.661866 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.853965 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfhff\" (UniqueName: \"kubernetes.io/projected/700ef04b-517a-40c1-964a-f935e19785c0-kube-api-access-qfhff\") pod \"700ef04b-517a-40c1-964a-f935e19785c0\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.854027 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-client-ca\") pod \"700ef04b-517a-40c1-964a-f935e19785c0\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.854074 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-config\") pod \"700ef04b-517a-40c1-964a-f935e19785c0\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.854146 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/700ef04b-517a-40c1-964a-f935e19785c0-serving-cert\") pod \"700ef04b-517a-40c1-964a-f935e19785c0\" (UID: \"700ef04b-517a-40c1-964a-f935e19785c0\") " Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.855187 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "700ef04b-517a-40c1-964a-f935e19785c0" (UID: "700ef04b-517a-40c1-964a-f935e19785c0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.855599 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-config" (OuterVolumeSpecName: "config") pod "700ef04b-517a-40c1-964a-f935e19785c0" (UID: "700ef04b-517a-40c1-964a-f935e19785c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.894716 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700ef04b-517a-40c1-964a-f935e19785c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "700ef04b-517a-40c1-964a-f935e19785c0" (UID: "700ef04b-517a-40c1-964a-f935e19785c0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.898706 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700ef04b-517a-40c1-964a-f935e19785c0-kube-api-access-qfhff" (OuterVolumeSpecName: "kube-api-access-qfhff") pod "700ef04b-517a-40c1-964a-f935e19785c0" (UID: "700ef04b-517a-40c1-964a-f935e19785c0"). InnerVolumeSpecName "kube-api-access-qfhff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.955359 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.955431 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/700ef04b-517a-40c1-964a-f935e19785c0-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.955481 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfhff\" (UniqueName: \"kubernetes.io/projected/700ef04b-517a-40c1-964a-f935e19785c0-kube-api-access-qfhff\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:02 crc kubenswrapper[4911]: I1201 00:13:02.955505 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/700ef04b-517a-40c1-964a-f935e19785c0-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.147000 4911 generic.go:334] "Generic (PLEG): container finished" podID="700ef04b-517a-40c1-964a-f935e19785c0" containerID="68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b" exitCode=0 Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.147114 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.147091 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2" event={"ID":"700ef04b-517a-40c1-964a-f935e19785c0","Type":"ContainerDied","Data":"68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b"} Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.147850 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2" event={"ID":"700ef04b-517a-40c1-964a-f935e19785c0","Type":"ContainerDied","Data":"84e1642657520ee25fe13ca86198198162b0bca06bd8cc62d8689c7f5c30c806"} Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.147883 4911 scope.go:117] "RemoveContainer" containerID="68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.169197 4911 scope.go:117] "RemoveContainer" containerID="68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b" Dec 01 00:13:03 crc kubenswrapper[4911]: E1201 00:13:03.170100 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b\": container with ID starting with 68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b not found: ID does not exist" containerID="68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.170167 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b"} err="failed to get container status \"68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b\": rpc error: code = NotFound desc 
= could not find container \"68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b\": container with ID starting with 68863295d5ad9b1ca51c5d2ec09cc02adc6c833674bc9ac4a9c7524067cdd50b not found: ID does not exist" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.187070 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"] Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.190078 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dcdbd9666-hqcm2"] Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.277257 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l"] Dec 01 00:13:03 crc kubenswrapper[4911]: E1201 00:13:03.277511 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700ef04b-517a-40c1-964a-f935e19785c0" containerName="route-controller-manager" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.277524 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="700ef04b-517a-40c1-964a-f935e19785c0" containerName="route-controller-manager" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.277653 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="700ef04b-517a-40c1-964a-f935e19785c0" containerName="route-controller-manager" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.278257 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.280659 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.283086 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.283109 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.283261 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.284698 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.285309 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.294712 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l"] Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.462803 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-client-ca\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.462868 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-serving-cert\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.462919 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7svq9\" (UniqueName: \"kubernetes.io/projected/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-kube-api-access-7svq9\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.462959 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-config\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.564623 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-client-ca\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.564785 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-serving-cert\") pod \"route-controller-manager-54c7d6c77-dlq9l\" 
(UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.564953 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7svq9\" (UniqueName: \"kubernetes.io/projected/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-kube-api-access-7svq9\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.565030 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-config\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.566334 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-client-ca\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.566389 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-config\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.570314 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-serving-cert\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.581382 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7svq9\" (UniqueName: \"kubernetes.io/projected/901a1249-a5e9-40f3-a04d-c48ab1fd1e6b-kube-api-access-7svq9\") pod \"route-controller-manager-54c7d6c77-dlq9l\" (UID: \"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:03 crc kubenswrapper[4911]: I1201 00:13:03.592264 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:04 crc kubenswrapper[4911]: I1201 00:13:04.049723 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l"] Dec 01 00:13:04 crc kubenswrapper[4911]: W1201 00:13:04.052825 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod901a1249_a5e9_40f3_a04d_c48ab1fd1e6b.slice/crio-4ea6e1e98e9bc35ab6f11c2085f7a8c24f989b8e7b3fa11d0aeab46bc0d47d12 WatchSource:0}: Error finding container 4ea6e1e98e9bc35ab6f11c2085f7a8c24f989b8e7b3fa11d0aeab46bc0d47d12: Status 404 returned error can't find the container with id 4ea6e1e98e9bc35ab6f11c2085f7a8c24f989b8e7b3fa11d0aeab46bc0d47d12 Dec 01 00:13:04 crc kubenswrapper[4911]: I1201 00:13:04.178213 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700ef04b-517a-40c1-964a-f935e19785c0" path="/var/lib/kubelet/pods/700ef04b-517a-40c1-964a-f935e19785c0/volumes" Dec 01 00:13:04 crc kubenswrapper[4911]: I1201 00:13:04.178978 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" event={"ID":"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b","Type":"ContainerStarted","Data":"4ea6e1e98e9bc35ab6f11c2085f7a8c24f989b8e7b3fa11d0aeab46bc0d47d12"} Dec 01 00:13:05 crc kubenswrapper[4911]: I1201 00:13:05.079998 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f4zfd" Dec 01 00:13:05 crc kubenswrapper[4911]: I1201 00:13:05.082432 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f4zfd" Dec 01 00:13:05 crc kubenswrapper[4911]: I1201 00:13:05.149751 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f4zfd" Dec 01 00:13:05 crc kubenswrapper[4911]: I1201 00:13:05.161280 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" event={"ID":"901a1249-a5e9-40f3-a04d-c48ab1fd1e6b","Type":"ContainerStarted","Data":"6d1ef988e47449c854aef11a2edfc9d3000437530c66b9c7842b7fcb48777274"} Dec 01 00:13:05 crc kubenswrapper[4911]: I1201 00:13:05.213863 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f4zfd" Dec 01 00:13:05 crc kubenswrapper[4911]: I1201 00:13:05.310082 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8gsmv" Dec 01 00:13:05 crc kubenswrapper[4911]: I1201 00:13:05.310122 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8gsmv" Dec 01 00:13:06 crc kubenswrapper[4911]: I1201 00:13:06.169792 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:06 crc 
kubenswrapper[4911]: I1201 00:13:06.179711 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" Dec 01 00:13:06 crc kubenswrapper[4911]: I1201 00:13:06.198496 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54c7d6c77-dlq9l" podStartSLOduration=4.198431358 podStartE2EDuration="4.198431358s" podCreationTimestamp="2025-12-01 00:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:13:06.197818821 +0000 UTC m=+346.336515642" watchObservedRunningTime="2025-12-01 00:13:06.198431358 +0000 UTC m=+346.337128159" Dec 01 00:13:06 crc kubenswrapper[4911]: I1201 00:13:06.347238 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8gsmv" podUID="6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a" containerName="registry-server" probeResult="failure" output=< Dec 01 00:13:06 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Dec 01 00:13:06 crc kubenswrapper[4911]: > Dec 01 00:13:07 crc kubenswrapper[4911]: I1201 00:13:07.464644 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gzf88" Dec 01 00:13:07 crc kubenswrapper[4911]: I1201 00:13:07.464708 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gzf88" Dec 01 00:13:07 crc kubenswrapper[4911]: I1201 00:13:07.527109 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gzf88" Dec 01 00:13:07 crc kubenswrapper[4911]: I1201 00:13:07.642047 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wxg8r" Dec 01 00:13:07 
crc kubenswrapper[4911]: I1201 00:13:07.642100 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wxg8r" Dec 01 00:13:07 crc kubenswrapper[4911]: I1201 00:13:07.679260 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wxg8r" Dec 01 00:13:08 crc kubenswrapper[4911]: I1201 00:13:08.263642 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wxg8r" Dec 01 00:13:08 crc kubenswrapper[4911]: I1201 00:13:08.264692 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gzf88" Dec 01 00:13:15 crc kubenswrapper[4911]: I1201 00:13:15.379883 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8gsmv" Dec 01 00:13:15 crc kubenswrapper[4911]: I1201 00:13:15.451277 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8gsmv" Dec 01 00:13:21 crc kubenswrapper[4911]: I1201 00:13:21.311563 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:13:21 crc kubenswrapper[4911]: I1201 00:13:21.312777 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:13:42 crc kubenswrapper[4911]: I1201 00:13:42.172764 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-fb864b4d-lvd6t"] Dec 01 00:13:42 crc kubenswrapper[4911]: I1201 00:13:42.173704 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" podUID="262f9eea-7a20-478e-a1a3-5a70fa377aef" containerName="controller-manager" containerID="cri-o://5e64fbad11b2fef108e385f3d4e1e9e9290ea799f58cffc5b3726a335854b3db" gracePeriod=30 Dec 01 00:13:42 crc kubenswrapper[4911]: I1201 00:13:42.426661 4911 generic.go:334] "Generic (PLEG): container finished" podID="262f9eea-7a20-478e-a1a3-5a70fa377aef" containerID="5e64fbad11b2fef108e385f3d4e1e9e9290ea799f58cffc5b3726a335854b3db" exitCode=0 Dec 01 00:13:42 crc kubenswrapper[4911]: I1201 00:13:42.426749 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" event={"ID":"262f9eea-7a20-478e-a1a3-5a70fa377aef","Type":"ContainerDied","Data":"5e64fbad11b2fef108e385f3d4e1e9e9290ea799f58cffc5b3726a335854b3db"} Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.231433 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.269878 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d9fb465d8-46nvs"] Dec 01 00:13:43 crc kubenswrapper[4911]: E1201 00:13:43.270145 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262f9eea-7a20-478e-a1a3-5a70fa377aef" containerName="controller-manager" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.270247 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="262f9eea-7a20-478e-a1a3-5a70fa377aef" containerName="controller-manager" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.270394 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="262f9eea-7a20-478e-a1a3-5a70fa377aef" containerName="controller-manager" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.270979 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.281673 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d9fb465d8-46nvs"] Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356350 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkz8d\" (UniqueName: \"kubernetes.io/projected/262f9eea-7a20-478e-a1a3-5a70fa377aef-kube-api-access-vkz8d\") pod \"262f9eea-7a20-478e-a1a3-5a70fa377aef\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-proxy-ca-bundles\") pod \"262f9eea-7a20-478e-a1a3-5a70fa377aef\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356507 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/262f9eea-7a20-478e-a1a3-5a70fa377aef-serving-cert\") pod \"262f9eea-7a20-478e-a1a3-5a70fa377aef\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356526 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-config\") pod \"262f9eea-7a20-478e-a1a3-5a70fa377aef\" (UID: \"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356542 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-client-ca\") pod \"262f9eea-7a20-478e-a1a3-5a70fa377aef\" (UID: 
\"262f9eea-7a20-478e-a1a3-5a70fa377aef\") " Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356677 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmclr\" (UniqueName: \"kubernetes.io/projected/6cdaee09-8907-4aa8-8812-7e13b05024dd-kube-api-access-gmclr\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356721 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-proxy-ca-bundles\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356755 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdaee09-8907-4aa8-8812-7e13b05024dd-serving-cert\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356868 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-config\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.356990 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-client-ca\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.357157 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "262f9eea-7a20-478e-a1a3-5a70fa377aef" (UID: "262f9eea-7a20-478e-a1a3-5a70fa377aef"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.357171 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-client-ca" (OuterVolumeSpecName: "client-ca") pod "262f9eea-7a20-478e-a1a3-5a70fa377aef" (UID: "262f9eea-7a20-478e-a1a3-5a70fa377aef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.357666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-config" (OuterVolumeSpecName: "config") pod "262f9eea-7a20-478e-a1a3-5a70fa377aef" (UID: "262f9eea-7a20-478e-a1a3-5a70fa377aef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.366612 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/262f9eea-7a20-478e-a1a3-5a70fa377aef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "262f9eea-7a20-478e-a1a3-5a70fa377aef" (UID: "262f9eea-7a20-478e-a1a3-5a70fa377aef"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.366658 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262f9eea-7a20-478e-a1a3-5a70fa377aef-kube-api-access-vkz8d" (OuterVolumeSpecName: "kube-api-access-vkz8d") pod "262f9eea-7a20-478e-a1a3-5a70fa377aef" (UID: "262f9eea-7a20-478e-a1a3-5a70fa377aef"). InnerVolumeSpecName "kube-api-access-vkz8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.435524 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" event={"ID":"262f9eea-7a20-478e-a1a3-5a70fa377aef","Type":"ContainerDied","Data":"06b234c5a332d57846dc6abfad2011f224a083e14cfc4f9049016e955afe5dec"} Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.435585 4911 scope.go:117] "RemoveContainer" containerID="5e64fbad11b2fef108e385f3d4e1e9e9290ea799f58cffc5b3726a335854b3db" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.435597 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fb864b4d-lvd6t" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.458990 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-client-ca\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459036 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmclr\" (UniqueName: \"kubernetes.io/projected/6cdaee09-8907-4aa8-8812-7e13b05024dd-kube-api-access-gmclr\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459074 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-proxy-ca-bundles\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459106 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdaee09-8907-4aa8-8812-7e13b05024dd-serving-cert\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459141 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-config\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459182 4911 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/262f9eea-7a20-478e-a1a3-5a70fa377aef-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459194 4911 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459205 4911 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459214 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkz8d\" (UniqueName: \"kubernetes.io/projected/262f9eea-7a20-478e-a1a3-5a70fa377aef-kube-api-access-vkz8d\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.459224 4911 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/262f9eea-7a20-478e-a1a3-5a70fa377aef-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.463240 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-client-ca\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc 
kubenswrapper[4911]: I1201 00:13:43.463391 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-config\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.463646 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdaee09-8907-4aa8-8812-7e13b05024dd-proxy-ca-bundles\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.467181 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fb864b4d-lvd6t"] Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.470171 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdaee09-8907-4aa8-8812-7e13b05024dd-serving-cert\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.472617 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-fb864b4d-lvd6t"] Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.477170 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmclr\" (UniqueName: \"kubernetes.io/projected/6cdaee09-8907-4aa8-8812-7e13b05024dd-kube-api-access-gmclr\") pod \"controller-manager-d9fb465d8-46nvs\" (UID: \"6cdaee09-8907-4aa8-8812-7e13b05024dd\") " pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" 
Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.601127 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:43 crc kubenswrapper[4911]: I1201 00:13:43.860209 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d9fb465d8-46nvs"] Dec 01 00:13:44 crc kubenswrapper[4911]: I1201 00:13:44.160655 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="262f9eea-7a20-478e-a1a3-5a70fa377aef" path="/var/lib/kubelet/pods/262f9eea-7a20-478e-a1a3-5a70fa377aef/volumes" Dec 01 00:13:44 crc kubenswrapper[4911]: I1201 00:13:44.443315 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" event={"ID":"6cdaee09-8907-4aa8-8812-7e13b05024dd","Type":"ContainerStarted","Data":"6c05ac765e4b9d9ccfb03ce4520e5f8b547ae3410595d847ee54a2bd9dc7396e"} Dec 01 00:13:44 crc kubenswrapper[4911]: I1201 00:13:44.443790 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" event={"ID":"6cdaee09-8907-4aa8-8812-7e13b05024dd","Type":"ContainerStarted","Data":"5b575947cae0443476a96fc4cddae98a5ef4d543818dd0c5267e5efe2be6861f"} Dec 01 00:13:44 crc kubenswrapper[4911]: I1201 00:13:44.444434 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:44 crc kubenswrapper[4911]: I1201 00:13:44.453168 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" Dec 01 00:13:44 crc kubenswrapper[4911]: I1201 00:13:44.488356 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d9fb465d8-46nvs" podStartSLOduration=2.488322578 
podStartE2EDuration="2.488322578s" podCreationTimestamp="2025-12-01 00:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:13:44.465089874 +0000 UTC m=+384.603786655" watchObservedRunningTime="2025-12-01 00:13:44.488322578 +0000 UTC m=+384.627019349" Dec 01 00:13:51 crc kubenswrapper[4911]: I1201 00:13:51.311648 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:13:51 crc kubenswrapper[4911]: I1201 00:13:51.312341 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:14:21 crc kubenswrapper[4911]: I1201 00:14:21.311774 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:14:21 crc kubenswrapper[4911]: I1201 00:14:21.312585 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:14:21 crc kubenswrapper[4911]: I1201 00:14:21.312657 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:14:21 crc kubenswrapper[4911]: I1201 00:14:21.313511 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40d2810c34239bb4bb2db3aad261028e5a8dee231ec9b175a243b041ac383386"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:14:21 crc kubenswrapper[4911]: I1201 00:14:21.313590 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://40d2810c34239bb4bb2db3aad261028e5a8dee231ec9b175a243b041ac383386" gracePeriod=600 Dec 01 00:14:21 crc kubenswrapper[4911]: I1201 00:14:21.713311 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="40d2810c34239bb4bb2db3aad261028e5a8dee231ec9b175a243b041ac383386" exitCode=0 Dec 01 00:14:21 crc kubenswrapper[4911]: I1201 00:14:21.713369 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"40d2810c34239bb4bb2db3aad261028e5a8dee231ec9b175a243b041ac383386"} Dec 01 00:14:21 crc kubenswrapper[4911]: I1201 00:14:21.713416 4911 scope.go:117] "RemoveContainer" containerID="9a16a27cbcd606fd1b9295977d7c808c97f47a00be0d9a14d15b097a5ec54dd3" Dec 01 00:14:22 crc kubenswrapper[4911]: I1201 00:14:22.722504 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" 
event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"5394cc273ac4360711d88e7051016a4910c1d8259c73c2bc9b3a4811b5f60a4d"} Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.189037 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms"] Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.190367 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.193642 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.194276 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.203481 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms"] Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.325289 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-secret-volume\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.325414 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-config-volume\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.325502 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc6s6\" (UniqueName: \"kubernetes.io/projected/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-kube-api-access-hc6s6\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.426909 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-config-volume\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.427031 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc6s6\" (UniqueName: \"kubernetes.io/projected/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-kube-api-access-hc6s6\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.427185 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-secret-volume\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.428823 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-config-volume\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.439178 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-secret-volume\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.460797 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc6s6\" (UniqueName: \"kubernetes.io/projected/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-kube-api-access-hc6s6\") pod \"collect-profiles-29409135-9kcms\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.519821 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.950309 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms"] Dec 01 00:15:00 crc kubenswrapper[4911]: I1201 00:15:00.997889 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" event={"ID":"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9","Type":"ContainerStarted","Data":"87ce0662f3dffa108be185acd2fbe5aa173482963cfe11a04f62a3c334bae038"} Dec 01 00:15:02 crc kubenswrapper[4911]: I1201 00:15:02.009433 4911 generic.go:334] "Generic (PLEG): container finished" podID="d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9" containerID="8905e7131b9131cb08aba434d6f0b1d684ba80b7d7f23aa2cec72bfdb2eb9ce3" exitCode=0 Dec 01 00:15:02 crc kubenswrapper[4911]: I1201 00:15:02.009619 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" event={"ID":"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9","Type":"ContainerDied","Data":"8905e7131b9131cb08aba434d6f0b1d684ba80b7d7f23aa2cec72bfdb2eb9ce3"} Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.293156 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.469051 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-secret-volume\") pod \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.469257 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc6s6\" (UniqueName: \"kubernetes.io/projected/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-kube-api-access-hc6s6\") pod \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.469309 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-config-volume\") pod \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\" (UID: \"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9\") " Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.470012 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-config-volume" (OuterVolumeSpecName: "config-volume") pod "d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9" (UID: "d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.477713 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9" (UID: "d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.477759 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-kube-api-access-hc6s6" (OuterVolumeSpecName: "kube-api-access-hc6s6") pod "d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9" (UID: "d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9"). InnerVolumeSpecName "kube-api-access-hc6s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.570613 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc6s6\" (UniqueName: \"kubernetes.io/projected/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-kube-api-access-hc6s6\") on node \"crc\" DevicePath \"\"" Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.570642 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:15:03 crc kubenswrapper[4911]: I1201 00:15:03.570651 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:15:04 crc kubenswrapper[4911]: I1201 00:15:04.023949 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" event={"ID":"d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9","Type":"ContainerDied","Data":"87ce0662f3dffa108be185acd2fbe5aa173482963cfe11a04f62a3c334bae038"} Dec 01 00:15:04 crc kubenswrapper[4911]: I1201 00:15:04.023998 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87ce0662f3dffa108be185acd2fbe5aa173482963cfe11a04f62a3c334bae038" Dec 01 00:15:04 crc kubenswrapper[4911]: I1201 00:15:04.024067 4911 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409135-9kcms" Dec 01 00:16:51 crc kubenswrapper[4911]: I1201 00:16:51.312438 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:16:51 crc kubenswrapper[4911]: I1201 00:16:51.313165 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:17:21 crc kubenswrapper[4911]: I1201 00:17:21.315773 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:17:21 crc kubenswrapper[4911]: I1201 00:17:21.316813 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.228023 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ptrhz"] Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.229226 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" 
podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovn-controller" containerID="cri-o://dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67" gracePeriod=30 Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.229271 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="nbdb" containerID="cri-o://b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0" gracePeriod=30 Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.229302 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9" gracePeriod=30 Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.229369 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kube-rbac-proxy-node" containerID="cri-o://ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1" gracePeriod=30 Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.229451 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovn-acl-logging" containerID="cri-o://17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880" gracePeriod=30 Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.229524 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="northd" 
containerID="cri-o://7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9" gracePeriod=30 Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.229726 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="sbdb" containerID="cri-o://74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6" gracePeriod=30 Dec 01 00:17:44 crc kubenswrapper[4911]: I1201 00:17:44.265404 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" containerID="cri-o://52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" gracePeriod=30 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.077111 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/3.log" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.080430 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovn-acl-logging/0.log" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.081053 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovn-controller/0.log" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.082071 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.141957 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hsdln"] Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142346 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kubecfg-setup" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142383 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kubecfg-setup" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142399 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142409 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142420 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9" containerName="collect-profiles" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142428 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9" containerName="collect-profiles" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142436 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovn-acl-logging" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142486 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovn-acl-logging" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142501 4911 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142509 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142542 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kube-rbac-proxy-node" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142551 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kube-rbac-proxy-node" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142565 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="nbdb" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142572 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="nbdb" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142584 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142590 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142618 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142626 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142635 4911 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovn-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142641 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovn-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142652 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="northd" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142659 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="northd" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142667 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="sbdb" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142674 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="sbdb" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.142705 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142713 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142856 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142870 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovn-acl-logging" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142879 4911 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142890 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovn-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142899 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="northd" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142908 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="nbdb" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142916 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142925 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="kube-rbac-proxy-node" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142935 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142944 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="sbdb" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.142975 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d537f08e-fe55-4c0d-a9ab-d2e699b1c7f9" containerName="collect-profiles" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.143101 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.143109 4911 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.143234 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.143609 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerName="ovnkube-controller" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.145937 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.198202 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovnkube-controller/3.log" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.200970 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovn-acl-logging/0.log" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202170 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptrhz_d8af6f05-3ccd-4b80-b144-530b83bfdc62/ovn-controller/0.log" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202608 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" exitCode=0 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202638 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6" exitCode=0 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202651 4911 generic.go:334] "Generic 
(PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0" exitCode=0 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202661 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9" exitCode=0 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202671 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9" exitCode=0 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202680 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1" exitCode=0 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202688 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880" exitCode=143 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202697 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" containerID="dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67" exitCode=143 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202671 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202758 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" 
event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202774 4911 scope.go:117] "RemoveContainer" containerID="52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202776 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202817 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202830 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202844 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202906 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202924 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202931 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202939 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202946 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202952 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202961 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202989 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.202998 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203010 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203022 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203030 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203037 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203046 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203054 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203083 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203090 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 
00:17:45.203096 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203102 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203108 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203118 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203128 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203136 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203143 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203151 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203158 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203166 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203173 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203180 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203187 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203193 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203202 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" event={"ID":"d8af6f05-3ccd-4b80-b144-530b83bfdc62","Type":"ContainerDied","Data":"0cc0cae2b87e99de99af6e4b7b6f16b5a7cdf4913ebace932223268d99736127"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203212 4911 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203222 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203229 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203235 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203242 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203248 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203257 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203265 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203273 4911 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203282 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.203778 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ptrhz" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.204823 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/2.log" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.205280 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/1.log" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.205313 4911 generic.go:334] "Generic (PLEG): container finished" podID="0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f" containerID="1c2934aaa0de303dd152f15c122a228a5a1fc72dc4064704c617c56c3180eea5" exitCode=2 Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.205361 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h54fr" event={"ID":"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f","Type":"ContainerDied","Data":"1c2934aaa0de303dd152f15c122a228a5a1fc72dc4064704c617c56c3180eea5"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.205378 4911 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77"} Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.205883 4911 scope.go:117] "RemoveContainer" 
containerID="1c2934aaa0de303dd152f15c122a228a5a1fc72dc4064704c617c56c3180eea5" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.206101 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-h54fr_openshift-multus(0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f)\"" pod="openshift-multus/multus-h54fr" podUID="0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.220710 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.237900 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-node-log\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238060 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-ovn\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238089 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-env-overrides\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238117 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-ovn-kubernetes\") pod 
\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238135 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-bin\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238157 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-systemd\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238180 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-config\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238201 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-openvswitch\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238227 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-script-lib\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238249 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-kubelet\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238271 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-var-lib-openvswitch\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238295 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238320 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovn-node-metrics-cert\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238347 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-slash\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238362 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-netd\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: 
\"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238428 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trgxc\" (UniqueName: \"kubernetes.io/projected/d8af6f05-3ccd-4b80-b144-530b83bfdc62-kube-api-access-trgxc\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238448 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-log-socket\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238483 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-etc-openvswitch\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238504 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-netns\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238529 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-systemd-units\") pod \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\" (UID: \"d8af6f05-3ccd-4b80-b144-530b83bfdc62\") " Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238631 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lxsp9\" (UniqueName: \"kubernetes.io/projected/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-kube-api-access-lxsp9\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238663 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-node-log\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238686 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-run-netns\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238709 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-etc-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238737 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-cni-netd\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238793 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovnkube-config\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238844 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-ovn\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238865 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-log-socket\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238904 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-systemd-units\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238931 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238954 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-slash\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238976 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovnkube-script-lib\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.239000 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-systemd\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.239035 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovn-node-metrics-cert\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.239065 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-kubelet\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.239092 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-var-lib-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.239111 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-env-overrides\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.239190 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.239218 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-cni-bin\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.239242 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 
01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.238011 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-node-log" (OuterVolumeSpecName: "node-log") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240277 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240382 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-log-socket" (OuterVolumeSpecName: "log-socket") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240444 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240508 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240525 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240603 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-slash" (OuterVolumeSpecName: "host-slash") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240651 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240708 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240784 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240814 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240843 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240832 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240878 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.240967 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.241170 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.241374 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.244489 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8af6f05-3ccd-4b80-b144-530b83bfdc62-kube-api-access-trgxc" (OuterVolumeSpecName: "kube-api-access-trgxc") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "kube-api-access-trgxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.244743 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.250217 4911 scope.go:117] "RemoveContainer" containerID="74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.253999 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d8af6f05-3ccd-4b80-b144-530b83bfdc62" (UID: "d8af6f05-3ccd-4b80-b144-530b83bfdc62"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.263602 4911 scope.go:117] "RemoveContainer" containerID="b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.276390 4911 scope.go:117] "RemoveContainer" containerID="7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.289895 4911 scope.go:117] "RemoveContainer" containerID="deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.306805 4911 scope.go:117] "RemoveContainer" containerID="ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.322154 4911 scope.go:117] "RemoveContainer" containerID="17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.339414 4911 scope.go:117] "RemoveContainer" containerID="dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.341919 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-log-socket\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342046 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-systemd-units\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342162 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342263 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-slash\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342356 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovnkube-script-lib\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342436 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-systemd\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343241 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovn-node-metrics-cert\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343390 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-kubelet\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343497 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-var-lib-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343587 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-env-overrides\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343683 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343772 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovnkube-script-lib\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343776 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-cni-bin\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343862 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343917 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsp9\" (UniqueName: \"kubernetes.io/projected/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-kube-api-access-lxsp9\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343947 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-node-log\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343982 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-cni-netd\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343990 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.343998 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-run-netns\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344019 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-etc-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344044 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-var-lib-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342654 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-log-socket\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344084 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovnkube-config\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344130 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-ovn\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344199 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-kubelet\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344242 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-ovn\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342672 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-systemd-units\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344212 4911 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 
00:17:45.344289 4911 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344305 4911 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344326 4911 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344339 4911 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344498 4911 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344516 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344529 4911 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344542 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-node-log\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344542 4911 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344589 4911 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344599 4911 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344609 4911 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344620 4911 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d8af6f05-3ccd-4b80-b144-530b83bfdc62-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344629 4911 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344638 4911 reconciler_common.go:293] "Volume detached for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344661 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trgxc\" (UniqueName: \"kubernetes.io/projected/d8af6f05-3ccd-4b80-b144-530b83bfdc62-kube-api-access-trgxc\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344670 4911 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344679 4911 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344688 4911 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344696 4911 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d8af6f05-3ccd-4b80-b144-530b83bfdc62-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344721 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-cni-netd\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344749 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-etc-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.344795 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-run-netns\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342623 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-systemd\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.345248 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-cni-bin\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.345345 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342727 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-host-slash\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.342707 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-run-openvswitch\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.345320 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovnkube-config\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.345666 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-env-overrides\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.350080 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-ovn-node-metrics-cert\") pod \"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.367715 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsp9\" (UniqueName: \"kubernetes.io/projected/9c3d879c-d8e5-467c-b2a0-014ce6e147ff-kube-api-access-lxsp9\") pod 
\"ovnkube-node-hsdln\" (UID: \"9c3d879c-d8e5-467c-b2a0-014ce6e147ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.402230 4911 scope.go:117] "RemoveContainer" containerID="b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.418933 4911 scope.go:117] "RemoveContainer" containerID="52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.419570 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": container with ID starting with 52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632 not found: ID does not exist" containerID="52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.419614 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} err="failed to get container status \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": rpc error: code = NotFound desc = could not find container \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": container with ID starting with 52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.419635 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.420082 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": container with ID 
starting with 2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5 not found: ID does not exist" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.420142 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} err="failed to get container status \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": rpc error: code = NotFound desc = could not find container \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": container with ID starting with 2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.420184 4911 scope.go:117] "RemoveContainer" containerID="74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.420611 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": container with ID starting with 74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6 not found: ID does not exist" containerID="74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.420641 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} err="failed to get container status \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": rpc error: code = NotFound desc = could not find container \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": container with ID starting with 74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6 not found: 
ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.420655 4911 scope.go:117] "RemoveContainer" containerID="b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.420965 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": container with ID starting with b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0 not found: ID does not exist" containerID="b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.420985 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} err="failed to get container status \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": rpc error: code = NotFound desc = could not find container \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": container with ID starting with b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.421004 4911 scope.go:117] "RemoveContainer" containerID="7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.421251 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": container with ID starting with 7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9 not found: ID does not exist" containerID="7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.421275 4911 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} err="failed to get container status \"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": rpc error: code = NotFound desc = could not find container \"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": container with ID starting with 7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.421293 4911 scope.go:117] "RemoveContainer" containerID="deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.421551 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": container with ID starting with deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9 not found: ID does not exist" containerID="deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.421571 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} err="failed to get container status \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": rpc error: code = NotFound desc = could not find container \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": container with ID starting with deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.421586 4911 scope.go:117] "RemoveContainer" containerID="ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.421898 4911 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": container with ID starting with ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1 not found: ID does not exist" containerID="ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.421918 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} err="failed to get container status \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": rpc error: code = NotFound desc = could not find container \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": container with ID starting with ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.421929 4911 scope.go:117] "RemoveContainer" containerID="17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.422182 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": container with ID starting with 17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880 not found: ID does not exist" containerID="17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.422279 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} err="failed to get container status \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": rpc error: code = NotFound desc = could 
not find container \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": container with ID starting with 17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.422308 4911 scope.go:117] "RemoveContainer" containerID="dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.422673 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": container with ID starting with dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67 not found: ID does not exist" containerID="dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.422703 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} err="failed to get container status \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": rpc error: code = NotFound desc = could not find container \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": container with ID starting with dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.422717 4911 scope.go:117] "RemoveContainer" containerID="b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155" Dec 01 00:17:45 crc kubenswrapper[4911]: E1201 00:17:45.423029 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": container with ID starting with b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155 not found: 
ID does not exist" containerID="b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.423063 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} err="failed to get container status \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": rpc error: code = NotFound desc = could not find container \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": container with ID starting with b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.423081 4911 scope.go:117] "RemoveContainer" containerID="52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.423362 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} err="failed to get container status \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": rpc error: code = NotFound desc = could not find container \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": container with ID starting with 52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.423387 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.423667 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} err="failed to get container status \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": rpc error: code = 
NotFound desc = could not find container \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": container with ID starting with 2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.423692 4911 scope.go:117] "RemoveContainer" containerID="74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.424123 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} err="failed to get container status \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": rpc error: code = NotFound desc = could not find container \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": container with ID starting with 74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.424150 4911 scope.go:117] "RemoveContainer" containerID="b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.424491 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} err="failed to get container status \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": rpc error: code = NotFound desc = could not find container \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": container with ID starting with b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.424520 4911 scope.go:117] "RemoveContainer" containerID="7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9" Dec 01 00:17:45 crc 
kubenswrapper[4911]: I1201 00:17:45.424759 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} err="failed to get container status \"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": rpc error: code = NotFound desc = could not find container \"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": container with ID starting with 7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.424786 4911 scope.go:117] "RemoveContainer" containerID="deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.425259 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} err="failed to get container status \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": rpc error: code = NotFound desc = could not find container \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": container with ID starting with deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.425289 4911 scope.go:117] "RemoveContainer" containerID="ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.425554 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} err="failed to get container status \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": rpc error: code = NotFound desc = could not find container \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": container 
with ID starting with ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.425580 4911 scope.go:117] "RemoveContainer" containerID="17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.426058 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} err="failed to get container status \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": rpc error: code = NotFound desc = could not find container \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": container with ID starting with 17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.426091 4911 scope.go:117] "RemoveContainer" containerID="dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.426352 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} err="failed to get container status \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": rpc error: code = NotFound desc = could not find container \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": container with ID starting with dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.426380 4911 scope.go:117] "RemoveContainer" containerID="b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.426671 4911 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} err="failed to get container status \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": rpc error: code = NotFound desc = could not find container \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": container with ID starting with b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.426698 4911 scope.go:117] "RemoveContainer" containerID="52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.426981 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} err="failed to get container status \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": rpc error: code = NotFound desc = could not find container \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": container with ID starting with 52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.427024 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.427362 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} err="failed to get container status \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": rpc error: code = NotFound desc = could not find container \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": container with ID starting with 2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5 not found: ID does not 
exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.427382 4911 scope.go:117] "RemoveContainer" containerID="74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.427730 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} err="failed to get container status \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": rpc error: code = NotFound desc = could not find container \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": container with ID starting with 74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.427760 4911 scope.go:117] "RemoveContainer" containerID="b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.428009 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} err="failed to get container status \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": rpc error: code = NotFound desc = could not find container \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": container with ID starting with b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.428029 4911 scope.go:117] "RemoveContainer" containerID="7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.428308 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} err="failed to get container status 
\"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": rpc error: code = NotFound desc = could not find container \"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": container with ID starting with 7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.428336 4911 scope.go:117] "RemoveContainer" containerID="deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.428588 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} err="failed to get container status \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": rpc error: code = NotFound desc = could not find container \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": container with ID starting with deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.428613 4911 scope.go:117] "RemoveContainer" containerID="ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.428850 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} err="failed to get container status \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": rpc error: code = NotFound desc = could not find container \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": container with ID starting with ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.428874 4911 scope.go:117] "RemoveContainer" 
containerID="17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.429096 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} err="failed to get container status \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": rpc error: code = NotFound desc = could not find container \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": container with ID starting with 17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.429121 4911 scope.go:117] "RemoveContainer" containerID="dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.429362 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} err="failed to get container status \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": rpc error: code = NotFound desc = could not find container \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": container with ID starting with dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.429388 4911 scope.go:117] "RemoveContainer" containerID="b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.429682 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} err="failed to get container status \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": rpc error: code = NotFound desc = could 
not find container \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": container with ID starting with b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.429727 4911 scope.go:117] "RemoveContainer" containerID="52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.429968 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} err="failed to get container status \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": rpc error: code = NotFound desc = could not find container \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": container with ID starting with 52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.430000 4911 scope.go:117] "RemoveContainer" containerID="2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.430288 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5"} err="failed to get container status \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": rpc error: code = NotFound desc = could not find container \"2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5\": container with ID starting with 2ca7d8975ef9ec359d7e98b95e0ebc95b4a24c4754ca78a4ed609f86a06212f5 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.430340 4911 scope.go:117] "RemoveContainer" containerID="74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 
00:17:45.430640 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6"} err="failed to get container status \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": rpc error: code = NotFound desc = could not find container \"74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6\": container with ID starting with 74ae63ddbd24a99893aefc4cad62c4971599d104c8384841302af544754573a6 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.430661 4911 scope.go:117] "RemoveContainer" containerID="b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.430892 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0"} err="failed to get container status \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": rpc error: code = NotFound desc = could not find container \"b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0\": container with ID starting with b0f62febd98f71d29708857dc7218de4ced30aad30e88fc81aaf3c0dd37e7ff0 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.430922 4911 scope.go:117] "RemoveContainer" containerID="7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.431186 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9"} err="failed to get container status \"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": rpc error: code = NotFound desc = could not find container \"7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9\": container with ID starting with 
7138b261ffc5abdd858ebc9f9eb21db613a5798ad2f688f59b04ea6b37f30bb9 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.431204 4911 scope.go:117] "RemoveContainer" containerID="deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.431434 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9"} err="failed to get container status \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": rpc error: code = NotFound desc = could not find container \"deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9\": container with ID starting with deb60d28daa72d755c9e3aebd08a809fac3bd00c33dea0f77c65c84ac8a7cde9 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.431589 4911 scope.go:117] "RemoveContainer" containerID="ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.431834 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1"} err="failed to get container status \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": rpc error: code = NotFound desc = could not find container \"ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1\": container with ID starting with ba053b0da0e532fe4907333c9e701519c3ef456f9f522d107768a9a7d7ea27e1 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.431873 4911 scope.go:117] "RemoveContainer" containerID="17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.432115 4911 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880"} err="failed to get container status \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": rpc error: code = NotFound desc = could not find container \"17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880\": container with ID starting with 17d9251cf9ef1c255082fcb34f25b145707164f2547968ac0cc96f62800df880 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.432141 4911 scope.go:117] "RemoveContainer" containerID="dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.432378 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67"} err="failed to get container status \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": rpc error: code = NotFound desc = could not find container \"dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67\": container with ID starting with dff0f85dbfa16fefb99bdcd7ce75585e26f43f4fad010c1c62526e11baa6db67 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.432401 4911 scope.go:117] "RemoveContainer" containerID="b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.432793 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155"} err="failed to get container status \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": rpc error: code = NotFound desc = could not find container \"b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155\": container with ID starting with b5ab9fbe12cf446576a432799f36b39c3f64e635dce50e0d612081618e14c155 not found: ID does not 
exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.432857 4911 scope.go:117] "RemoveContainer" containerID="52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.433197 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632"} err="failed to get container status \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": rpc error: code = NotFound desc = could not find container \"52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632\": container with ID starting with 52154f666623a99b11aab7f59f274451526183cba71f6e0496e80b5a54743632 not found: ID does not exist" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.469902 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.569334 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ptrhz"] Dec 01 00:17:45 crc kubenswrapper[4911]: I1201 00:17:45.574763 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ptrhz"] Dec 01 00:17:46 crc kubenswrapper[4911]: I1201 00:17:46.162603 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8af6f05-3ccd-4b80-b144-530b83bfdc62" path="/var/lib/kubelet/pods/d8af6f05-3ccd-4b80-b144-530b83bfdc62/volumes" Dec 01 00:17:46 crc kubenswrapper[4911]: I1201 00:17:46.214767 4911 generic.go:334] "Generic (PLEG): container finished" podID="9c3d879c-d8e5-467c-b2a0-014ce6e147ff" containerID="084f94f8032e3870d80a62dbf615efacc74b6899504092bfd96d2ee7c79a49d3" exitCode=0 Dec 01 00:17:46 crc kubenswrapper[4911]: I1201 00:17:46.214821 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" 
event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerDied","Data":"084f94f8032e3870d80a62dbf615efacc74b6899504092bfd96d2ee7c79a49d3"} Dec 01 00:17:46 crc kubenswrapper[4911]: I1201 00:17:46.214856 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"bed054ad71efac5711b8fd37d4be1bf153768f1cc49705066cf3beb5537c3e53"} Dec 01 00:17:47 crc kubenswrapper[4911]: I1201 00:17:47.226174 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"528f02fc3874c05466db72a7780cc508213b8b78378aa8e5c7c7439d47103119"} Dec 01 00:17:47 crc kubenswrapper[4911]: I1201 00:17:47.226634 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"48d73d951468b5182cb78536775af7f71c711c5fdec7e214abb617baf7995a6f"} Dec 01 00:17:47 crc kubenswrapper[4911]: I1201 00:17:47.226655 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"c79be9aea6020653015e4bcd96b7a2c046308d6e679b480edc3d6aab0039ea1a"} Dec 01 00:17:47 crc kubenswrapper[4911]: I1201 00:17:47.226666 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"90e4d92385f3c65540b0be70557af6a73d2275e3bfb1c90485eb3e8e61fb7e8d"} Dec 01 00:17:47 crc kubenswrapper[4911]: I1201 00:17:47.226680 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" 
event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"8825ddcb4bc0d5b71bad8c6f56e17b683e1e722fa6a58865389970b836bed1ed"} Dec 01 00:17:47 crc kubenswrapper[4911]: I1201 00:17:47.226689 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"5a3c2048a51ed7ca233e5844357a149c76ee2e82534699eda595a0ade5bbdfaf"} Dec 01 00:17:50 crc kubenswrapper[4911]: I1201 00:17:50.248520 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"24337f015387a4a9c08314a66c90880b955abd3f296bc960f1ed36f6e1e6672f"} Dec 01 00:17:51 crc kubenswrapper[4911]: I1201 00:17:51.311895 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:17:51 crc kubenswrapper[4911]: I1201 00:17:51.311969 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:17:51 crc kubenswrapper[4911]: I1201 00:17:51.312024 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:17:51 crc kubenswrapper[4911]: I1201 00:17:51.312645 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5394cc273ac4360711d88e7051016a4910c1d8259c73c2bc9b3a4811b5f60a4d"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:17:51 crc kubenswrapper[4911]: I1201 00:17:51.312715 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://5394cc273ac4360711d88e7051016a4910c1d8259c73c2bc9b3a4811b5f60a4d" gracePeriod=600 Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.263986 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="5394cc273ac4360711d88e7051016a4910c1d8259c73c2bc9b3a4811b5f60a4d" exitCode=0 Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.264074 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"5394cc273ac4360711d88e7051016a4910c1d8259c73c2bc9b3a4811b5f60a4d"} Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.264776 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"90b36241e5b9b053d99526d93d8d01cf61ef69de06fe015790f530836c79c9f7"} Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.264803 4911 scope.go:117] "RemoveContainer" containerID="40d2810c34239bb4bb2db3aad261028e5a8dee231ec9b175a243b041ac383386" Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.277845 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" 
event={"ID":"9c3d879c-d8e5-467c-b2a0-014ce6e147ff","Type":"ContainerStarted","Data":"f02ba939dd07a9e7adf2f4fb7422829cfc4894dcde8de9846799d9be9bfa9d51"} Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.278836 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.278888 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.278908 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.309691 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.318356 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:17:52 crc kubenswrapper[4911]: I1201 00:17:52.322701 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" podStartSLOduration=7.322689439 podStartE2EDuration="7.322689439s" podCreationTimestamp="2025-12-01 00:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:17:52.320305193 +0000 UTC m=+632.459001964" watchObservedRunningTime="2025-12-01 00:17:52.322689439 +0000 UTC m=+632.461386210" Dec 01 00:17:56 crc kubenswrapper[4911]: I1201 00:17:56.152322 4911 scope.go:117] "RemoveContainer" containerID="1c2934aaa0de303dd152f15c122a228a5a1fc72dc4064704c617c56c3180eea5" Dec 01 00:17:56 crc kubenswrapper[4911]: E1201 00:17:56.153614 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-h54fr_openshift-multus(0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f)\"" pod="openshift-multus/multus-h54fr" podUID="0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f" Dec 01 00:18:08 crc kubenswrapper[4911]: I1201 00:18:08.152722 4911 scope.go:117] "RemoveContainer" containerID="1c2934aaa0de303dd152f15c122a228a5a1fc72dc4064704c617c56c3180eea5" Dec 01 00:18:09 crc kubenswrapper[4911]: I1201 00:18:09.382348 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/2.log" Dec 01 00:18:09 crc kubenswrapper[4911]: I1201 00:18:09.384078 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/1.log" Dec 01 00:18:09 crc kubenswrapper[4911]: I1201 00:18:09.384136 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h54fr" event={"ID":"0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f","Type":"ContainerStarted","Data":"a999570e7a17afd72d5fd523e6c64c0469bafb5baf8b88924b9e496eb7ab91a9"} Dec 01 00:18:15 crc kubenswrapper[4911]: I1201 00:18:15.500990 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hsdln" Dec 01 00:18:20 crc kubenswrapper[4911]: I1201 00:18:20.462384 4911 scope.go:117] "RemoveContainer" containerID="44a30b0c8cb5dc15dd7ccc77d999bd70f74d71b253bc77bee77e6531552d3d77" Dec 01 00:18:24 crc kubenswrapper[4911]: I1201 00:18:24.500257 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h54fr_0fefe746-bc51-4bb4-a9b9-cc3dd29c2c0f/kube-multus/2.log" Dec 01 00:19:11 crc kubenswrapper[4911]: I1201 00:19:11.866214 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4zfd"] Dec 01 00:19:11 crc kubenswrapper[4911]: I1201 00:19:11.867200 4911 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f4zfd" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerName="registry-server" containerID="cri-o://9c3486da02a91cc12b6f2ab6e530b930f3ba24465751ef8e2677709e77e3b087" gracePeriod=30 Dec 01 00:19:13 crc kubenswrapper[4911]: I1201 00:19:13.842794 4911 generic.go:334] "Generic (PLEG): container finished" podID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerID="9c3486da02a91cc12b6f2ab6e530b930f3ba24465751ef8e2677709e77e3b087" exitCode=0 Dec 01 00:19:13 crc kubenswrapper[4911]: I1201 00:19:13.842919 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4zfd" event={"ID":"d8e7bcea-3402-4357-b5ff-3dd1067f9500","Type":"ContainerDied","Data":"9c3486da02a91cc12b6f2ab6e530b930f3ba24465751ef8e2677709e77e3b087"} Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.050014 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4zfd" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.154325 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj8p6\" (UniqueName: \"kubernetes.io/projected/d8e7bcea-3402-4357-b5ff-3dd1067f9500-kube-api-access-dj8p6\") pod \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.154446 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-utilities\") pod \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.154541 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-catalog-content\") pod \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\" (UID: \"d8e7bcea-3402-4357-b5ff-3dd1067f9500\") " Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.155437 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-utilities" (OuterVolumeSpecName: "utilities") pod "d8e7bcea-3402-4357-b5ff-3dd1067f9500" (UID: "d8e7bcea-3402-4357-b5ff-3dd1067f9500"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.162092 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e7bcea-3402-4357-b5ff-3dd1067f9500-kube-api-access-dj8p6" (OuterVolumeSpecName: "kube-api-access-dj8p6") pod "d8e7bcea-3402-4357-b5ff-3dd1067f9500" (UID: "d8e7bcea-3402-4357-b5ff-3dd1067f9500"). InnerVolumeSpecName "kube-api-access-dj8p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.177597 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8e7bcea-3402-4357-b5ff-3dd1067f9500" (UID: "d8e7bcea-3402-4357-b5ff-3dd1067f9500"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.256570 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.256623 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj8p6\" (UniqueName: \"kubernetes.io/projected/d8e7bcea-3402-4357-b5ff-3dd1067f9500-kube-api-access-dj8p6\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.256651 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e7bcea-3402-4357-b5ff-3dd1067f9500-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.850782 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4zfd" event={"ID":"d8e7bcea-3402-4357-b5ff-3dd1067f9500","Type":"ContainerDied","Data":"b82acdcd1575ae3d1a8811f0d70eb48b602de6af641714fd27339a76dffbc35a"} Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.850832 4911 scope.go:117] "RemoveContainer" containerID="9c3486da02a91cc12b6f2ab6e530b930f3ba24465751ef8e2677709e77e3b087" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.850868 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4zfd" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.869651 4911 scope.go:117] "RemoveContainer" containerID="445af9d995497e514f9523b3c3d39171c6a4de0620f2ea57ee058a4cebb6b618" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.881929 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4zfd"] Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.885513 4911 scope.go:117] "RemoveContainer" containerID="adb7b1928e7a5759cf2d7ad36673b995f38d7272e4546dc7f191cd1145c08ba6" Dec 01 00:19:14 crc kubenswrapper[4911]: I1201 00:19:14.887662 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4zfd"] Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.874833 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg"] Dec 01 00:19:15 crc kubenswrapper[4911]: E1201 00:19:15.875381 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerName="registry-server" Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.875395 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerName="registry-server" Dec 01 00:19:15 crc kubenswrapper[4911]: E1201 00:19:15.875409 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerName="extract-content" Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.875417 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerName="extract-content" Dec 01 00:19:15 crc kubenswrapper[4911]: E1201 00:19:15.875440 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerName="extract-utilities" Dec 01 
00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.875448 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerName="extract-utilities" Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.875570 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" containerName="registry-server" Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.876254 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.879924 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.900929 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg"] Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.976561 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.976645 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm57d\" (UniqueName: \"kubernetes.io/projected/a48e7868-4393-43db-afbe-4752ecf2c918-kube-api-access-mm57d\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 
00:19:15 crc kubenswrapper[4911]: I1201 00:19:15.976878 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.078681 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.078745 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm57d\" (UniqueName: \"kubernetes.io/projected/a48e7868-4393-43db-afbe-4752ecf2c918-kube-api-access-mm57d\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.078790 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.079447 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.079732 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.099596 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm57d\" (UniqueName: \"kubernetes.io/projected/a48e7868-4393-43db-afbe-4752ecf2c918-kube-api-access-mm57d\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.158157 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e7bcea-3402-4357-b5ff-3dd1067f9500" path="/var/lib/kubelet/pods/d8e7bcea-3402-4357-b5ff-3dd1067f9500/volumes" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.208817 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.409754 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg"] Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.868501 4911 generic.go:334] "Generic (PLEG): container finished" podID="a48e7868-4393-43db-afbe-4752ecf2c918" containerID="0a93975b3f9c119811d5ef2dec925fd05d99728db4a7182baa835c261f07a8c7" exitCode=0 Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.868560 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" event={"ID":"a48e7868-4393-43db-afbe-4752ecf2c918","Type":"ContainerDied","Data":"0a93975b3f9c119811d5ef2dec925fd05d99728db4a7182baa835c261f07a8c7"} Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.868601 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" event={"ID":"a48e7868-4393-43db-afbe-4752ecf2c918","Type":"ContainerStarted","Data":"5ba5203e787f51707ed718e1d019ed1357f2ee01ca486236a82f06289f97d382"} Dec 01 00:19:16 crc kubenswrapper[4911]: I1201 00:19:16.871047 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 00:19:18 crc kubenswrapper[4911]: I1201 00:19:18.884917 4911 generic.go:334] "Generic (PLEG): container finished" podID="a48e7868-4393-43db-afbe-4752ecf2c918" containerID="68c19341383474334609f109b2a2e815cbaa6a63fd37318c4d4dd91b7ee348ce" exitCode=0 Dec 01 00:19:18 crc kubenswrapper[4911]: I1201 00:19:18.885159 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" 
event={"ID":"a48e7868-4393-43db-afbe-4752ecf2c918","Type":"ContainerDied","Data":"68c19341383474334609f109b2a2e815cbaa6a63fd37318c4d4dd91b7ee348ce"} Dec 01 00:19:19 crc kubenswrapper[4911]: I1201 00:19:19.897262 4911 generic.go:334] "Generic (PLEG): container finished" podID="a48e7868-4393-43db-afbe-4752ecf2c918" containerID="63bcad6de0f45bfe3da46f92bab0a70362ad1d7587514ed3c41e13eca4fef9cb" exitCode=0 Dec 01 00:19:19 crc kubenswrapper[4911]: I1201 00:19:19.897780 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" event={"ID":"a48e7868-4393-43db-afbe-4752ecf2c918","Type":"ContainerDied","Data":"63bcad6de0f45bfe3da46f92bab0a70362ad1d7587514ed3c41e13eca4fef9cb"} Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.183248 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.359538 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm57d\" (UniqueName: \"kubernetes.io/projected/a48e7868-4393-43db-afbe-4752ecf2c918-kube-api-access-mm57d\") pod \"a48e7868-4393-43db-afbe-4752ecf2c918\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.359618 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-bundle\") pod \"a48e7868-4393-43db-afbe-4752ecf2c918\" (UID: \"a48e7868-4393-43db-afbe-4752ecf2c918\") " Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.359652 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-util\") pod \"a48e7868-4393-43db-afbe-4752ecf2c918\" (UID: 
\"a48e7868-4393-43db-afbe-4752ecf2c918\") " Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.364154 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-bundle" (OuterVolumeSpecName: "bundle") pod "a48e7868-4393-43db-afbe-4752ecf2c918" (UID: "a48e7868-4393-43db-afbe-4752ecf2c918"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.365526 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a48e7868-4393-43db-afbe-4752ecf2c918-kube-api-access-mm57d" (OuterVolumeSpecName: "kube-api-access-mm57d") pod "a48e7868-4393-43db-afbe-4752ecf2c918" (UID: "a48e7868-4393-43db-afbe-4752ecf2c918"). InnerVolumeSpecName "kube-api-access-mm57d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.395428 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-util" (OuterVolumeSpecName: "util") pod "a48e7868-4393-43db-afbe-4752ecf2c918" (UID: "a48e7868-4393-43db-afbe-4752ecf2c918"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.461932 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm57d\" (UniqueName: \"kubernetes.io/projected/a48e7868-4393-43db-afbe-4752ecf2c918-kube-api-access-mm57d\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.462223 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.462388 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a48e7868-4393-43db-afbe-4752ecf2c918-util\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.917890 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" event={"ID":"a48e7868-4393-43db-afbe-4752ecf2c918","Type":"ContainerDied","Data":"5ba5203e787f51707ed718e1d019ed1357f2ee01ca486236a82f06289f97d382"} Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.917950 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ba5203e787f51707ed718e1d019ed1357f2ee01ca486236a82f06289f97d382" Dec 01 00:19:21 crc kubenswrapper[4911]: I1201 00:19:21.917954 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.275736 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws"] Dec 01 00:19:22 crc kubenswrapper[4911]: E1201 00:19:22.276112 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48e7868-4393-43db-afbe-4752ecf2c918" containerName="extract" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.276136 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48e7868-4393-43db-afbe-4752ecf2c918" containerName="extract" Dec 01 00:19:22 crc kubenswrapper[4911]: E1201 00:19:22.276156 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48e7868-4393-43db-afbe-4752ecf2c918" containerName="pull" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.276170 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48e7868-4393-43db-afbe-4752ecf2c918" containerName="pull" Dec 01 00:19:22 crc kubenswrapper[4911]: E1201 00:19:22.276207 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48e7868-4393-43db-afbe-4752ecf2c918" containerName="util" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.276221 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48e7868-4393-43db-afbe-4752ecf2c918" containerName="util" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.276404 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="a48e7868-4393-43db-afbe-4752ecf2c918" containerName="extract" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.277726 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.279604 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.292280 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws"] Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.373399 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.373494 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skqbd\" (UniqueName: \"kubernetes.io/projected/25ac6dcc-fada-4401-91b8-a5d37711b1ec-kube-api-access-skqbd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.373665 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: 
I1201 00:19:22.474623 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skqbd\" (UniqueName: \"kubernetes.io/projected/25ac6dcc-fada-4401-91b8-a5d37711b1ec-kube-api-access-skqbd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.474705 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.474825 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.475601 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.475712 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.497422 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skqbd\" (UniqueName: \"kubernetes.io/projected/25ac6dcc-fada-4401-91b8-a5d37711b1ec-kube-api-access-skqbd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.596607 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.869783 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws"] Dec 01 00:19:22 crc kubenswrapper[4911]: I1201 00:19:22.927197 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" event={"ID":"25ac6dcc-fada-4401-91b8-a5d37711b1ec","Type":"ContainerStarted","Data":"db50c177da979a66bd14caf949e86708a3baac01f3e4477dd2f4b8cd0db555c5"} Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.067248 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq"] Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.068279 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.078497 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq"] Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.186334 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.186413 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.186662 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2254k\" (UniqueName: \"kubernetes.io/projected/e02a088d-f2be-4aaf-bca1-fa4858cea430-kube-api-access-2254k\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.288360 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2254k\" (UniqueName: 
\"kubernetes.io/projected/e02a088d-f2be-4aaf-bca1-fa4858cea430-kube-api-access-2254k\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.288592 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.288665 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.289504 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.289737 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: 
\"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.311998 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2254k\" (UniqueName: \"kubernetes.io/projected/e02a088d-f2be-4aaf-bca1-fa4858cea430-kube-api-access-2254k\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.390604 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.667542 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq"] Dec 01 00:19:23 crc kubenswrapper[4911]: W1201 00:19:23.672676 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode02a088d_f2be_4aaf_bca1_fa4858cea430.slice/crio-c31552d6ad8610f5bd3841d9e22d68e3e71775338e1fe6995232c76ad9a838e5 WatchSource:0}: Error finding container c31552d6ad8610f5bd3841d9e22d68e3e71775338e1fe6995232c76ad9a838e5: Status 404 returned error can't find the container with id c31552d6ad8610f5bd3841d9e22d68e3e71775338e1fe6995232c76ad9a838e5 Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.937170 4911 generic.go:334] "Generic (PLEG): container finished" podID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerID="7e5ec89c673321d91a6efebc7f84176dc2fccbbb35b8fab40fec37c60bfa7392" exitCode=0 Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.937276 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" event={"ID":"25ac6dcc-fada-4401-91b8-a5d37711b1ec","Type":"ContainerDied","Data":"7e5ec89c673321d91a6efebc7f84176dc2fccbbb35b8fab40fec37c60bfa7392"} Dec 01 00:19:23 crc kubenswrapper[4911]: I1201 00:19:23.939065 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" event={"ID":"e02a088d-f2be-4aaf-bca1-fa4858cea430","Type":"ContainerStarted","Data":"c31552d6ad8610f5bd3841d9e22d68e3e71775338e1fe6995232c76ad9a838e5"} Dec 01 00:19:24 crc kubenswrapper[4911]: I1201 00:19:24.948069 4911 generic.go:334] "Generic (PLEG): container finished" podID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerID="017a3a54cca03d4a90cfab9922a7190c46f5135afa6b0d177d6748f455edae96" exitCode=0 Dec 01 00:19:24 crc kubenswrapper[4911]: I1201 00:19:24.948195 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" event={"ID":"e02a088d-f2be-4aaf-bca1-fa4858cea430","Type":"ContainerDied","Data":"017a3a54cca03d4a90cfab9922a7190c46f5135afa6b0d177d6748f455edae96"} Dec 01 00:19:29 crc kubenswrapper[4911]: I1201 00:19:29.042895 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" event={"ID":"25ac6dcc-fada-4401-91b8-a5d37711b1ec","Type":"ContainerStarted","Data":"f3d1439b4fd232a412617a48198a6988d59831bb140a86df9b49574e6fa75abe"} Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.039665 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77"] Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.041169 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.052674 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77"] Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.158007 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.158068 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pz6q\" (UniqueName: \"kubernetes.io/projected/7d8f6326-b67c-4682-a37b-a79fb151552c-kube-api-access-8pz6q\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.158301 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.260081 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.260232 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.260320 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pz6q\" (UniqueName: \"kubernetes.io/projected/7d8f6326-b67c-4682-a37b-a79fb151552c-kube-api-access-8pz6q\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.260577 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.261072 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: 
\"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.322806 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pz6q\" (UniqueName: \"kubernetes.io/projected/7d8f6326-b67c-4682-a37b-a79fb151552c-kube-api-access-8pz6q\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:31 crc kubenswrapper[4911]: I1201 00:19:31.364662 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:19:33 crc kubenswrapper[4911]: I1201 00:19:33.064871 4911 generic.go:334] "Generic (PLEG): container finished" podID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerID="f3d1439b4fd232a412617a48198a6988d59831bb140a86df9b49574e6fa75abe" exitCode=0 Dec 01 00:19:33 crc kubenswrapper[4911]: I1201 00:19:33.064941 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" event={"ID":"25ac6dcc-fada-4401-91b8-a5d37711b1ec","Type":"ContainerDied","Data":"f3d1439b4fd232a412617a48198a6988d59831bb140a86df9b49574e6fa75abe"} Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.205358 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.206614 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.209080 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.209375 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-cz8wf" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.220245 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.250939 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.308198 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx944\" (UniqueName: \"kubernetes.io/projected/3dcb1345-86cc-4712-a549-5ec7b06343f3-kube-api-access-vx944\") pod \"obo-prometheus-operator-668cf9dfbb-twxt7\" (UID: \"3dcb1345-86cc-4712-a549-5ec7b06343f3\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.333586 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.334346 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.341021 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qsk74" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.341186 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.359282 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.364539 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.365371 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.378151 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.410585 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx944\" (UniqueName: \"kubernetes.io/projected/3dcb1345-86cc-4712-a549-5ec7b06343f3-kube-api-access-vx944\") pod \"obo-prometheus-operator-668cf9dfbb-twxt7\" (UID: \"3dcb1345-86cc-4712-a549-5ec7b06343f3\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.441241 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx944\" (UniqueName: \"kubernetes.io/projected/3dcb1345-86cc-4712-a549-5ec7b06343f3-kube-api-access-vx944\") pod \"obo-prometheus-operator-668cf9dfbb-twxt7\" (UID: \"3dcb1345-86cc-4712-a549-5ec7b06343f3\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.516020 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac116d8a-ec46-415a-b9bb-357493c28dda-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-rvs92\" (UID: \"ac116d8a-ec46-415a-b9bb-357493c28dda\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.516081 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac116d8a-ec46-415a-b9bb-357493c28dda-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-rvs92\" (UID: 
\"ac116d8a-ec46-415a-b9bb-357493c28dda\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.516122 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b5103e9-aa5a-402b-a755-a2f2be984479-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh\" (UID: \"6b5103e9-aa5a-402b-a755-a2f2be984479\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.516149 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b5103e9-aa5a-402b-a755-a2f2be984479-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh\" (UID: \"6b5103e9-aa5a-402b-a755-a2f2be984479\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.523586 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.528249 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-hmcx6"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.528950 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.534605 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.534818 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-dvxzn" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.546174 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-hmcx6"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.617719 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b5103e9-aa5a-402b-a755-a2f2be984479-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh\" (UID: \"6b5103e9-aa5a-402b-a755-a2f2be984479\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.617985 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b5103e9-aa5a-402b-a755-a2f2be984479-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh\" (UID: \"6b5103e9-aa5a-402b-a755-a2f2be984479\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.618079 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac116d8a-ec46-415a-b9bb-357493c28dda-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-rvs92\" (UID: \"ac116d8a-ec46-415a-b9bb-357493c28dda\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.618111 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29bd7a0-85c7-43bc-8bab-adcafae9d8dc-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-hmcx6\" (UID: \"b29bd7a0-85c7-43bc-8bab-adcafae9d8dc\") " pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.618135 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac116d8a-ec46-415a-b9bb-357493c28dda-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-rvs92\" (UID: \"ac116d8a-ec46-415a-b9bb-357493c28dda\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.618158 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvqrw\" (UniqueName: \"kubernetes.io/projected/b29bd7a0-85c7-43bc-8bab-adcafae9d8dc-kube-api-access-cvqrw\") pod \"observability-operator-d8bb48f5d-hmcx6\" (UID: \"b29bd7a0-85c7-43bc-8bab-adcafae9d8dc\") " pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.621623 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b5103e9-aa5a-402b-a755-a2f2be984479-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh\" (UID: \"6b5103e9-aa5a-402b-a755-a2f2be984479\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.623912 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac116d8a-ec46-415a-b9bb-357493c28dda-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-rvs92\" (UID: \"ac116d8a-ec46-415a-b9bb-357493c28dda\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.699611 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac116d8a-ec46-415a-b9bb-357493c28dda-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-rvs92\" (UID: \"ac116d8a-ec46-415a-b9bb-357493c28dda\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.700594 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b5103e9-aa5a-402b-a755-a2f2be984479-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh\" (UID: \"6b5103e9-aa5a-402b-a755-a2f2be984479\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.705864 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-9sg5q"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.706556 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:19:35 crc kubenswrapper[4911]: W1201 00:19:35.708757 4911 reflector.go:561] object-"openshift-operators"/"perses-operator-dockercfg-mhgx2": failed to list *v1.Secret: secrets "perses-operator-dockercfg-mhgx2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Dec 01 00:19:35 crc kubenswrapper[4911]: E1201 00:19:35.708809 4911 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"perses-operator-dockercfg-mhgx2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"perses-operator-dockercfg-mhgx2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.719049 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvzxk\" (UniqueName: \"kubernetes.io/projected/b2a8cc4e-d5e8-4825-a42b-9b8534030ff8-kube-api-access-kvzxk\") pod \"perses-operator-5446b9c989-9sg5q\" (UID: \"b2a8cc4e-d5e8-4825-a42b-9b8534030ff8\") " pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.719177 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29bd7a0-85c7-43bc-8bab-adcafae9d8dc-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-hmcx6\" (UID: \"b29bd7a0-85c7-43bc-8bab-adcafae9d8dc\") " pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.719217 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cvqrw\" (UniqueName: \"kubernetes.io/projected/b29bd7a0-85c7-43bc-8bab-adcafae9d8dc-kube-api-access-cvqrw\") pod \"observability-operator-d8bb48f5d-hmcx6\" (UID: \"b29bd7a0-85c7-43bc-8bab-adcafae9d8dc\") " pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.719250 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2a8cc4e-d5e8-4825-a42b-9b8534030ff8-openshift-service-ca\") pod \"perses-operator-5446b9c989-9sg5q\" (UID: \"b2a8cc4e-d5e8-4825-a42b-9b8534030ff8\") " pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.873333 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b29bd7a0-85c7-43bc-8bab-adcafae9d8dc-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-hmcx6\" (UID: \"b29bd7a0-85c7-43bc-8bab-adcafae9d8dc\") " pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.873418 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2a8cc4e-d5e8-4825-a42b-9b8534030ff8-openshift-service-ca\") pod \"perses-operator-5446b9c989-9sg5q\" (UID: \"b2a8cc4e-d5e8-4825-a42b-9b8534030ff8\") " pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.873507 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvzxk\" (UniqueName: \"kubernetes.io/projected/b2a8cc4e-d5e8-4825-a42b-9b8534030ff8-kube-api-access-kvzxk\") pod \"perses-operator-5446b9c989-9sg5q\" (UID: \"b2a8cc4e-d5e8-4825-a42b-9b8534030ff8\") " pod="openshift-operators/perses-operator-5446b9c989-9sg5q" 
Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.874674 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2a8cc4e-d5e8-4825-a42b-9b8534030ff8-openshift-service-ca\") pod \"perses-operator-5446b9c989-9sg5q\" (UID: \"b2a8cc4e-d5e8-4825-a42b-9b8534030ff8\") " pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.881669 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-9sg5q"] Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.894520 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvqrw\" (UniqueName: \"kubernetes.io/projected/b29bd7a0-85c7-43bc-8bab-adcafae9d8dc-kube-api-access-cvqrw\") pod \"observability-operator-d8bb48f5d-hmcx6\" (UID: \"b29bd7a0-85c7-43bc-8bab-adcafae9d8dc\") " pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.904659 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvzxk\" (UniqueName: \"kubernetes.io/projected/b2a8cc4e-d5e8-4825-a42b-9b8534030ff8-kube-api-access-kvzxk\") pod \"perses-operator-5446b9c989-9sg5q\" (UID: \"b2a8cc4e-d5e8-4825-a42b-9b8534030ff8\") " pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:19:35 crc kubenswrapper[4911]: I1201 00:19:35.924016 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.059018 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.060615 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.193653 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" event={"ID":"25ac6dcc-fada-4401-91b8-a5d37711b1ec","Type":"ContainerStarted","Data":"2ff2d2b3b078a91620508ec87d173a63c40b341dce0b6e5b1f6042c0c1baf3b6"} Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.442200 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" podStartSLOduration=10.774835659 podStartE2EDuration="14.442177683s" podCreationTimestamp="2025-12-01 00:19:22 +0000 UTC" firstStartedPulling="2025-12-01 00:19:23.939972248 +0000 UTC m=+724.078669019" lastFinishedPulling="2025-12-01 00:19:27.607314272 +0000 UTC m=+727.746011043" observedRunningTime="2025-12-01 00:19:36.230325543 +0000 UTC m=+736.369022314" watchObservedRunningTime="2025-12-01 00:19:36.442177683 +0000 UTC m=+736.580874464" Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.444871 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77"] Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.756704 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7"] Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.827896 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-mhgx2" Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.832650 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.841956 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh"] Dec 01 00:19:36 crc kubenswrapper[4911]: W1201 00:19:36.853830 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b5103e9_aa5a_402b_a755_a2f2be984479.slice/crio-f4624a36e18a1c03617a5f3cc340bb815626b10271a13aefa058a2fe8d43334e WatchSource:0}: Error finding container f4624a36e18a1c03617a5f3cc340bb815626b10271a13aefa058a2fe8d43334e: Status 404 returned error can't find the container with id f4624a36e18a1c03617a5f3cc340bb815626b10271a13aefa058a2fe8d43334e Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.868801 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-hmcx6"] Dec 01 00:19:36 crc kubenswrapper[4911]: W1201 00:19:36.874649 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb29bd7a0_85c7_43bc_8bab_adcafae9d8dc.slice/crio-424c003a16a6718eb3e85d916667341d55ab3ebeef279f57c6de6521711471e0 WatchSource:0}: Error finding container 424c003a16a6718eb3e85d916667341d55ab3ebeef279f57c6de6521711471e0: Status 404 returned error can't find the container with id 424c003a16a6718eb3e85d916667341d55ab3ebeef279f57c6de6521711471e0 Dec 01 00:19:36 crc kubenswrapper[4911]: I1201 00:19:36.876160 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92"] Dec 01 00:19:36 crc kubenswrapper[4911]: W1201 00:19:36.884701 4911 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac116d8a_ec46_415a_b9bb_357493c28dda.slice/crio-9f6b5bbfd27daf4e34e2832bb68ebc03f931d9440084fd93c40541a8a6b2581c WatchSource:0}: Error finding container 9f6b5bbfd27daf4e34e2832bb68ebc03f931d9440084fd93c40541a8a6b2581c: Status 404 returned error can't find the container with id 9f6b5bbfd27daf4e34e2832bb68ebc03f931d9440084fd93c40541a8a6b2581c Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.034353 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-9sg5q"] Dec 01 00:19:37 crc kubenswrapper[4911]: W1201 00:19:37.040953 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a8cc4e_d5e8_4825_a42b_9b8534030ff8.slice/crio-410ec0440d22c7e7f8bdfd538b1aee469a0f765c4a2e8915dee221f8b49d0b8f WatchSource:0}: Error finding container 410ec0440d22c7e7f8bdfd538b1aee469a0f765c4a2e8915dee221f8b49d0b8f: Status 404 returned error can't find the container with id 410ec0440d22c7e7f8bdfd538b1aee469a0f765c4a2e8915dee221f8b49d0b8f Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.201530 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" event={"ID":"6b5103e9-aa5a-402b-a755-a2f2be984479","Type":"ContainerStarted","Data":"f4624a36e18a1c03617a5f3cc340bb815626b10271a13aefa058a2fe8d43334e"} Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.203515 4911 generic.go:334] "Generic (PLEG): container finished" podID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerID="2ff2d2b3b078a91620508ec87d173a63c40b341dce0b6e5b1f6042c0c1baf3b6" exitCode=0 Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.203566 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" 
event={"ID":"25ac6dcc-fada-4401-91b8-a5d37711b1ec","Type":"ContainerDied","Data":"2ff2d2b3b078a91620508ec87d173a63c40b341dce0b6e5b1f6042c0c1baf3b6"} Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.204419 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" event={"ID":"7d8f6326-b67c-4682-a37b-a79fb151552c","Type":"ContainerStarted","Data":"56a0137fe5b620cd0eb0bccf4bbd2195723ea63071dc2898051f23a59609cdbb"} Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.205650 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" event={"ID":"ac116d8a-ec46-415a-b9bb-357493c28dda","Type":"ContainerStarted","Data":"9f6b5bbfd27daf4e34e2832bb68ebc03f931d9440084fd93c40541a8a6b2581c"} Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.206587 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" event={"ID":"b29bd7a0-85c7-43bc-8bab-adcafae9d8dc","Type":"ContainerStarted","Data":"424c003a16a6718eb3e85d916667341d55ab3ebeef279f57c6de6521711471e0"} Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.207475 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" event={"ID":"b2a8cc4e-d5e8-4825-a42b-9b8534030ff8","Type":"ContainerStarted","Data":"410ec0440d22c7e7f8bdfd538b1aee469a0f765c4a2e8915dee221f8b49d0b8f"} Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.208411 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" event={"ID":"3dcb1345-86cc-4712-a549-5ec7b06343f3","Type":"ContainerStarted","Data":"9c7e3d1b752e4392e35e0953585a96d47b3278c95447b72363be741dbff1964c"} Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.209983 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerID="5bdbc46303bde9c038ab03b5bb3797c0dc66979cc7fd60129d1b27979bce0581" exitCode=0 Dec 01 00:19:37 crc kubenswrapper[4911]: I1201 00:19:37.210010 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" event={"ID":"e02a088d-f2be-4aaf-bca1-fa4858cea430","Type":"ContainerDied","Data":"5bdbc46303bde9c038ab03b5bb3797c0dc66979cc7fd60129d1b27979bce0581"} Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.215861 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" event={"ID":"e02a088d-f2be-4aaf-bca1-fa4858cea430","Type":"ContainerStarted","Data":"d47c5e98fadc5ed70329281c44c66b6d5bd0d75226fc755d2b569725ee61fc69"} Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.216863 4911 generic.go:334] "Generic (PLEG): container finished" podID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerID="134df8a09aecf0a0f131d8e4387e644d200aea891a7d906a5c7013f3980efc38" exitCode=0 Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.216922 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" event={"ID":"7d8f6326-b67c-4682-a37b-a79fb151552c","Type":"ContainerDied","Data":"134df8a09aecf0a0f131d8e4387e644d200aea891a7d906a5c7013f3980efc38"} Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.238016 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" podStartSLOduration=4.614979497 podStartE2EDuration="15.238000967s" podCreationTimestamp="2025-12-01 00:19:23 +0000 UTC" firstStartedPulling="2025-12-01 00:19:24.949648274 +0000 UTC m=+725.088345045" lastFinishedPulling="2025-12-01 00:19:35.572669744 +0000 UTC m=+735.711366515" 
observedRunningTime="2025-12-01 00:19:38.235016933 +0000 UTC m=+738.373713714" watchObservedRunningTime="2025-12-01 00:19:38.238000967 +0000 UTC m=+738.376697738" Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.490224 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.641619 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skqbd\" (UniqueName: \"kubernetes.io/projected/25ac6dcc-fada-4401-91b8-a5d37711b1ec-kube-api-access-skqbd\") pod \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.641762 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-util\") pod \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.641832 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-bundle\") pod \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\" (UID: \"25ac6dcc-fada-4401-91b8-a5d37711b1ec\") " Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.643004 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-bundle" (OuterVolumeSpecName: "bundle") pod "25ac6dcc-fada-4401-91b8-a5d37711b1ec" (UID: "25ac6dcc-fada-4401-91b8-a5d37711b1ec"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.648711 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ac6dcc-fada-4401-91b8-a5d37711b1ec-kube-api-access-skqbd" (OuterVolumeSpecName: "kube-api-access-skqbd") pod "25ac6dcc-fada-4401-91b8-a5d37711b1ec" (UID: "25ac6dcc-fada-4401-91b8-a5d37711b1ec"). InnerVolumeSpecName "kube-api-access-skqbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.673920 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-util" (OuterVolumeSpecName: "util") pod "25ac6dcc-fada-4401-91b8-a5d37711b1ec" (UID: "25ac6dcc-fada-4401-91b8-a5d37711b1ec"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.743315 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-util\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.743355 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25ac6dcc-fada-4401-91b8-a5d37711b1ec-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:38 crc kubenswrapper[4911]: I1201 00:19:38.743369 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skqbd\" (UniqueName: \"kubernetes.io/projected/25ac6dcc-fada-4401-91b8-a5d37711b1ec-kube-api-access-skqbd\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:39 crc kubenswrapper[4911]: I1201 00:19:39.226309 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" Dec 01 00:19:39 crc kubenswrapper[4911]: I1201 00:19:39.226595 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws" event={"ID":"25ac6dcc-fada-4401-91b8-a5d37711b1ec","Type":"ContainerDied","Data":"db50c177da979a66bd14caf949e86708a3baac01f3e4477dd2f4b8cd0db555c5"} Dec 01 00:19:39 crc kubenswrapper[4911]: I1201 00:19:39.226648 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db50c177da979a66bd14caf949e86708a3baac01f3e4477dd2f4b8cd0db555c5" Dec 01 00:19:41 crc kubenswrapper[4911]: I1201 00:19:41.237657 4911 generic.go:334] "Generic (PLEG): container finished" podID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerID="d47c5e98fadc5ed70329281c44c66b6d5bd0d75226fc755d2b569725ee61fc69" exitCode=0 Dec 01 00:19:41 crc kubenswrapper[4911]: I1201 00:19:41.237720 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" event={"ID":"e02a088d-f2be-4aaf-bca1-fa4858cea430","Type":"ContainerDied","Data":"d47c5e98fadc5ed70329281c44c66b6d5bd0d75226fc755d2b569725ee61fc69"} Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.783788 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.799075 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2254k\" (UniqueName: \"kubernetes.io/projected/e02a088d-f2be-4aaf-bca1-fa4858cea430-kube-api-access-2254k\") pod \"e02a088d-f2be-4aaf-bca1-fa4858cea430\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.799142 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-util\") pod \"e02a088d-f2be-4aaf-bca1-fa4858cea430\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.799181 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-bundle\") pod \"e02a088d-f2be-4aaf-bca1-fa4858cea430\" (UID: \"e02a088d-f2be-4aaf-bca1-fa4858cea430\") " Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.800426 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-bundle" (OuterVolumeSpecName: "bundle") pod "e02a088d-f2be-4aaf-bca1-fa4858cea430" (UID: "e02a088d-f2be-4aaf-bca1-fa4858cea430"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.807648 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02a088d-f2be-4aaf-bca1-fa4858cea430-kube-api-access-2254k" (OuterVolumeSpecName: "kube-api-access-2254k") pod "e02a088d-f2be-4aaf-bca1-fa4858cea430" (UID: "e02a088d-f2be-4aaf-bca1-fa4858cea430"). InnerVolumeSpecName "kube-api-access-2254k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.812553 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-util" (OuterVolumeSpecName: "util") pod "e02a088d-f2be-4aaf-bca1-fa4858cea430" (UID: "e02a088d-f2be-4aaf-bca1-fa4858cea430"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.900467 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.900496 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2254k\" (UniqueName: \"kubernetes.io/projected/e02a088d-f2be-4aaf-bca1-fa4858cea430-kube-api-access-2254k\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:42 crc kubenswrapper[4911]: I1201 00:19:42.900506 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e02a088d-f2be-4aaf-bca1-fa4858cea430-util\") on node \"crc\" DevicePath \"\"" Dec 01 00:19:43 crc kubenswrapper[4911]: I1201 00:19:43.256155 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" event={"ID":"e02a088d-f2be-4aaf-bca1-fa4858cea430","Type":"ContainerDied","Data":"c31552d6ad8610f5bd3841d9e22d68e3e71775338e1fe6995232c76ad9a838e5"} Dec 01 00:19:43 crc kubenswrapper[4911]: I1201 00:19:43.256202 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c31552d6ad8610f5bd3841d9e22d68e3e71775338e1fe6995232c76ad9a838e5" Dec 01 00:19:43 crc kubenswrapper[4911]: I1201 00:19:43.256182 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.985577 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-b69jt"] Dec 01 00:19:50 crc kubenswrapper[4911]: E1201 00:19:50.986682 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerName="extract" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.986702 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerName="extract" Dec 01 00:19:50 crc kubenswrapper[4911]: E1201 00:19:50.986723 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerName="pull" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.986734 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerName="pull" Dec 01 00:19:50 crc kubenswrapper[4911]: E1201 00:19:50.986753 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerName="util" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.986764 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerName="util" Dec 01 00:19:50 crc kubenswrapper[4911]: E1201 00:19:50.986785 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerName="util" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.986795 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerName="util" Dec 01 00:19:50 crc kubenswrapper[4911]: E1201 00:19:50.986810 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerName="extract" 
Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.986821 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerName="extract" Dec 01 00:19:50 crc kubenswrapper[4911]: E1201 00:19:50.986840 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerName="pull" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.986851 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerName="pull" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.987004 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ac6dcc-fada-4401-91b8-a5d37711b1ec" containerName="extract" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.987026 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02a088d-f2be-4aaf-bca1-fa4858cea430" containerName="extract" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.987678 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.989585 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-v9lg6" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.989769 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.990071 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Dec 01 00:19:50 crc kubenswrapper[4911]: I1201 00:19:50.992974 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-b69jt"] Dec 01 00:19:51 crc kubenswrapper[4911]: I1201 00:19:51.059041 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lmt\" (UniqueName: \"kubernetes.io/projected/64b5c170-0885-429f-b015-6443b497472e-kube-api-access-48lmt\") pod \"interconnect-operator-5bb49f789d-b69jt\" (UID: \"64b5c170-0885-429f-b015-6443b497472e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" Dec 01 00:19:51 crc kubenswrapper[4911]: I1201 00:19:51.160117 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lmt\" (UniqueName: \"kubernetes.io/projected/64b5c170-0885-429f-b015-6443b497472e-kube-api-access-48lmt\") pod \"interconnect-operator-5bb49f789d-b69jt\" (UID: \"64b5c170-0885-429f-b015-6443b497472e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" Dec 01 00:19:51 crc kubenswrapper[4911]: I1201 00:19:51.181654 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lmt\" (UniqueName: \"kubernetes.io/projected/64b5c170-0885-429f-b015-6443b497472e-kube-api-access-48lmt\") pod 
\"interconnect-operator-5bb49f789d-b69jt\" (UID: \"64b5c170-0885-429f-b015-6443b497472e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" Dec 01 00:19:51 crc kubenswrapper[4911]: I1201 00:19:51.431596 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" Dec 01 00:19:51 crc kubenswrapper[4911]: I1201 00:19:51.436791 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:19:51 crc kubenswrapper[4911]: I1201 00:19:51.436865 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:19:52 crc kubenswrapper[4911]: I1201 00:19:52.895484 4911 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.125581 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-b4fbd6456-hj9jg"] Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.126553 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.128768 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-d2df6" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.129612 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.136769 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-b4fbd6456-hj9jg"] Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.229835 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzcj\" (UniqueName: \"kubernetes.io/projected/c870d438-0de1-4af7-b63e-7f9d90128a0e-kube-api-access-rxzcj\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.229921 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c870d438-0de1-4af7-b63e-7f9d90128a0e-apiservice-cert\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.229997 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c870d438-0de1-4af7-b63e-7f9d90128a0e-webhook-cert\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.331294 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c870d438-0de1-4af7-b63e-7f9d90128a0e-webhook-cert\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.331383 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzcj\" (UniqueName: \"kubernetes.io/projected/c870d438-0de1-4af7-b63e-7f9d90128a0e-kube-api-access-rxzcj\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.331416 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c870d438-0de1-4af7-b63e-7f9d90128a0e-apiservice-cert\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.416019 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c870d438-0de1-4af7-b63e-7f9d90128a0e-webhook-cert\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.416020 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c870d438-0de1-4af7-b63e-7f9d90128a0e-apiservice-cert\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.416422 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzcj\" (UniqueName: \"kubernetes.io/projected/c870d438-0de1-4af7-b63e-7f9d90128a0e-kube-api-access-rxzcj\") pod \"elastic-operator-b4fbd6456-hj9jg\" (UID: \"c870d438-0de1-4af7-b63e-7f9d90128a0e\") " pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:19:53 crc kubenswrapper[4911]: I1201 00:19:53.444537 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" Dec 01 00:20:06 crc kubenswrapper[4911]: I1201 00:20:06.515904 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-b4fbd6456-hj9jg"] Dec 01 00:20:12 crc kubenswrapper[4911]: W1201 00:20:12.729799 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc870d438_0de1_4af7_b63e_7f9d90128a0e.slice/crio-a7955c8a6447e43d09685fd73f92841b84c03be3c5b6cd0faec8cf0ac6e582c9 WatchSource:0}: Error finding container a7955c8a6447e43d09685fd73f92841b84c03be3c5b6cd0faec8cf0ac6e582c9: Status 404 returned error can't find the container with id a7955c8a6447e43d09685fd73f92841b84c03be3c5b6cd0faec8cf0ac6e582c9 Dec 01 00:20:12 crc kubenswrapper[4911]: E1201 00:20:12.777900 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb" Dec 01 00:20:12 crc kubenswrapper[4911]: E1201 00:20:12.778736 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad7
5ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cvqrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-hmcx6_openshift-operators(b29bd7a0-85c7-43bc-8bab-adcafae9d8dc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:20:12 crc kubenswrapper[4911]: E1201 00:20:12.780850 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" podUID="b29bd7a0-85c7-43bc-8bab-adcafae9d8dc" Dec 01 00:20:12 crc kubenswrapper[4911]: I1201 00:20:12.913393 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" event={"ID":"c870d438-0de1-4af7-b63e-7f9d90128a0e","Type":"ContainerStarted","Data":"a7955c8a6447e43d09685fd73f92841b84c03be3c5b6cd0faec8cf0ac6e582c9"} Dec 01 00:20:12 crc kubenswrapper[4911]: E1201 00:20:12.914391 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" podUID="b29bd7a0-85c7-43bc-8bab-adcafae9d8dc" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.253539 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.253723 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt 
--web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-78b478d44c-rvs92_openshift-operators(ac116d8a-ec46-415a-b9bb-357493c28dda): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.255026 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" podUID="ac116d8a-ec46-415a-b9bb-357493c28dda" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.783011 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.783206 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vx944,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-twxt7_openshift-operators(3dcb1345-86cc-4712-a549-5ec7b06343f3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.784674 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" podUID="3dcb1345-86cc-4712-a549-5ec7b06343f3" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.787802 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.787918 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh_openshift-operators(6b5103e9-aa5a-402b-a755-a2f2be984479): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.789103 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" podUID="6b5103e9-aa5a-402b-a755-a2f2be984479" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.919037 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" podUID="6b5103e9-aa5a-402b-a755-a2f2be984479" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.921576 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" podUID="ac116d8a-ec46-415a-b9bb-357493c28dda" Dec 01 00:20:13 crc kubenswrapper[4911]: E1201 00:20:13.921676 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" podUID="3dcb1345-86cc-4712-a549-5ec7b06343f3" Dec 01 00:20:15 crc kubenswrapper[4911]: I1201 00:20:15.331103 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-b69jt"] Dec 01 00:20:15 crc kubenswrapper[4911]: W1201 00:20:15.338713 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64b5c170_0885_429f_b015_6443b497472e.slice/crio-24ec7dcd8340719826771dfd6157219f7a9adaa32668e3bf24e3b6707bcfbd28 WatchSource:0}: Error finding container 24ec7dcd8340719826771dfd6157219f7a9adaa32668e3bf24e3b6707bcfbd28: 
Status 404 returned error can't find the container with id 24ec7dcd8340719826771dfd6157219f7a9adaa32668e3bf24e3b6707bcfbd28 Dec 01 00:20:15 crc kubenswrapper[4911]: E1201 00:20:15.443271 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385" Dec 01 00:20:15 crc kubenswrapper[4911]: E1201 00:20:15.443472 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kvzxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5446b9c989-9sg5q_openshift-operators(b2a8cc4e-d5e8-4825-a42b-9b8534030ff8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:20:15 crc kubenswrapper[4911]: E1201 00:20:15.444710 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" podUID="b2a8cc4e-d5e8-4825-a42b-9b8534030ff8" Dec 01 00:20:15 crc kubenswrapper[4911]: I1201 00:20:15.931586 4911 generic.go:334] "Generic (PLEG): container finished" podID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerID="088fa37e30b35217333541aa5028487fe640cffceda455e94f7e6c249105f2bd" exitCode=0 Dec 01 00:20:15 crc kubenswrapper[4911]: I1201 00:20:15.931921 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" 
event={"ID":"7d8f6326-b67c-4682-a37b-a79fb151552c","Type":"ContainerDied","Data":"088fa37e30b35217333541aa5028487fe640cffceda455e94f7e6c249105f2bd"} Dec 01 00:20:15 crc kubenswrapper[4911]: I1201 00:20:15.937454 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" event={"ID":"64b5c170-0885-429f-b015-6443b497472e","Type":"ContainerStarted","Data":"24ec7dcd8340719826771dfd6157219f7a9adaa32668e3bf24e3b6707bcfbd28"} Dec 01 00:20:15 crc kubenswrapper[4911]: E1201 00:20:15.939352 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385\\\"\"" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" podUID="b2a8cc4e-d5e8-4825-a42b-9b8534030ff8" Dec 01 00:20:16 crc kubenswrapper[4911]: I1201 00:20:16.947433 4911 generic.go:334] "Generic (PLEG): container finished" podID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerID="cafb209a0079fe3b94d400506c6e23af7894ba16debfe6d4b461ccf000382ada" exitCode=0 Dec 01 00:20:16 crc kubenswrapper[4911]: I1201 00:20:16.947513 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" event={"ID":"7d8f6326-b67c-4682-a37b-a79fb151552c","Type":"ContainerDied","Data":"cafb209a0079fe3b94d400506c6e23af7894ba16debfe6d4b461ccf000382ada"} Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.482285 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.624380 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zhl82"] Dec 01 00:20:18 crc kubenswrapper[4911]: E1201 00:20:18.624930 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerName="pull" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.624954 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerName="pull" Dec 01 00:20:18 crc kubenswrapper[4911]: E1201 00:20:18.624979 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerName="util" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.624987 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerName="util" Dec 01 00:20:18 crc kubenswrapper[4911]: E1201 00:20:18.625006 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerName="extract" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.625014 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerName="extract" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.625167 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8f6326-b67c-4682-a37b-a79fb151552c" containerName="extract" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.626172 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.634725 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zhl82"] Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.663778 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-bundle\") pod \"7d8f6326-b67c-4682-a37b-a79fb151552c\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.663844 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pz6q\" (UniqueName: \"kubernetes.io/projected/7d8f6326-b67c-4682-a37b-a79fb151552c-kube-api-access-8pz6q\") pod \"7d8f6326-b67c-4682-a37b-a79fb151552c\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.663875 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-util\") pod \"7d8f6326-b67c-4682-a37b-a79fb151552c\" (UID: \"7d8f6326-b67c-4682-a37b-a79fb151552c\") " Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.663972 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-utilities\") pod \"redhat-operators-zhl82\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.664016 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-catalog-content\") pod \"redhat-operators-zhl82\" (UID: 
\"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.664041 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5bl\" (UniqueName: \"kubernetes.io/projected/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-kube-api-access-lw5bl\") pod \"redhat-operators-zhl82\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.665407 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-bundle" (OuterVolumeSpecName: "bundle") pod "7d8f6326-b67c-4682-a37b-a79fb151552c" (UID: "7d8f6326-b67c-4682-a37b-a79fb151552c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.670301 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8f6326-b67c-4682-a37b-a79fb151552c-kube-api-access-8pz6q" (OuterVolumeSpecName: "kube-api-access-8pz6q") pod "7d8f6326-b67c-4682-a37b-a79fb151552c" (UID: "7d8f6326-b67c-4682-a37b-a79fb151552c"). InnerVolumeSpecName "kube-api-access-8pz6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.690070 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-util" (OuterVolumeSpecName: "util") pod "7d8f6326-b67c-4682-a37b-a79fb151552c" (UID: "7d8f6326-b67c-4682-a37b-a79fb151552c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.765333 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-catalog-content\") pod \"redhat-operators-zhl82\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.765397 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw5bl\" (UniqueName: \"kubernetes.io/projected/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-kube-api-access-lw5bl\") pod \"redhat-operators-zhl82\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.765513 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-utilities\") pod \"redhat-operators-zhl82\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.765573 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pz6q\" (UniqueName: \"kubernetes.io/projected/7d8f6326-b67c-4682-a37b-a79fb151552c-kube-api-access-8pz6q\") on node \"crc\" DevicePath \"\"" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.765589 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-util\") on node \"crc\" DevicePath \"\"" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.765603 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7d8f6326-b67c-4682-a37b-a79fb151552c-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.766206 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-utilities\") pod \"redhat-operators-zhl82\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.766201 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-catalog-content\") pod \"redhat-operators-zhl82\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.784874 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5bl\" (UniqueName: \"kubernetes.io/projected/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-kube-api-access-lw5bl\") pod \"redhat-operators-zhl82\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.940776 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.971732 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" event={"ID":"7d8f6326-b67c-4682-a37b-a79fb151552c","Type":"ContainerDied","Data":"56a0137fe5b620cd0eb0bccf4bbd2195723ea63071dc2898051f23a59609cdbb"} Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.971811 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56a0137fe5b620cd0eb0bccf4bbd2195723ea63071dc2898051f23a59609cdbb" Dec 01 00:20:18 crc kubenswrapper[4911]: I1201 00:20:18.971923 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77" Dec 01 00:20:19 crc kubenswrapper[4911]: I1201 00:20:19.185759 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zhl82"] Dec 01 00:20:19 crc kubenswrapper[4911]: W1201 00:20:19.194660 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod463d1ab1_c7b7_4fbe_aa8f_7bb459d7294d.slice/crio-bcedeebb8aa173b05fc96fc35c14a66d3d76e193f990d0ce1cdd473a604b08f6 WatchSource:0}: Error finding container bcedeebb8aa173b05fc96fc35c14a66d3d76e193f990d0ce1cdd473a604b08f6: Status 404 returned error can't find the container with id bcedeebb8aa173b05fc96fc35c14a66d3d76e193f990d0ce1cdd473a604b08f6 Dec 01 00:20:19 crc kubenswrapper[4911]: I1201 00:20:19.981960 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" event={"ID":"c870d438-0de1-4af7-b63e-7f9d90128a0e","Type":"ContainerStarted","Data":"9817e90fe6ea471862f052e14be4f3187fceace482591203ee222594095c70fb"} Dec 01 00:20:19 crc kubenswrapper[4911]: I1201 00:20:19.988878 4911 
generic.go:334] "Generic (PLEG): container finished" podID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerID="ad946bd58b737b26bd8c204c60f77460d295aa042175fff80f3b2849d36c5bcd" exitCode=0 Dec 01 00:20:19 crc kubenswrapper[4911]: I1201 00:20:19.988920 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhl82" event={"ID":"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d","Type":"ContainerDied","Data":"ad946bd58b737b26bd8c204c60f77460d295aa042175fff80f3b2849d36c5bcd"} Dec 01 00:20:19 crc kubenswrapper[4911]: I1201 00:20:19.988942 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhl82" event={"ID":"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d","Type":"ContainerStarted","Data":"bcedeebb8aa173b05fc96fc35c14a66d3d76e193f990d0ce1cdd473a604b08f6"} Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.011697 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-b4fbd6456-hj9jg" podStartSLOduration=20.696081363 podStartE2EDuration="27.011676433s" podCreationTimestamp="2025-12-01 00:19:53 +0000 UTC" firstStartedPulling="2025-12-01 00:20:12.7344288 +0000 UTC m=+772.873125571" lastFinishedPulling="2025-12-01 00:20:19.05002387 +0000 UTC m=+779.188720641" observedRunningTime="2025-12-01 00:20:20.008541735 +0000 UTC m=+780.147238526" watchObservedRunningTime="2025-12-01 00:20:20.011676433 +0000 UTC m=+780.150373204" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.367118 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.374000 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.377505 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.378008 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.378651 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.378953 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.379138 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.379315 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.379780 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-xkw95" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.380062 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.381605 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.392077 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.392969 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393008 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393036 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/97613b36-7079-4bac-afc8-0c933bcb2d4d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393053 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393074 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-unicast-hosts\") 
pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393091 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393106 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393121 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393148 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 
crc kubenswrapper[4911]: I1201 00:20:20.393168 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393184 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393200 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393216 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393233 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: 
\"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.393248 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494259 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494307 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494344 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 
00:20:20.494362 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494378 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494397 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494415 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494451 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 
crc kubenswrapper[4911]: I1201 00:20:20.494494 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494516 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/97613b36-7079-4bac-afc8-0c933bcb2d4d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494535 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494560 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494590 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" 
(UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494610 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.494628 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.495320 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.495512 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.495683 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: 
\"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.495804 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.495942 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.496105 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.496757 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.497846 4911 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.500096 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.500307 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.500691 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/97613b36-7079-4bac-afc8-0c933bcb2d4d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.500977 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 
00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.501068 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.501658 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.502094 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/97613b36-7079-4bac-afc8-0c933bcb2d4d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"97613b36-7079-4bac-afc8-0c933bcb2d4d\") " pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.703995 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.956318 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Dec 01 00:20:20 crc kubenswrapper[4911]: W1201 00:20:20.969773 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97613b36_7079_4bac_afc8_0c933bcb2d4d.slice/crio-cec5686fe1e1a1f84bb5df1656188acc353ee8946c08252e7d7d52ef47d1e652 WatchSource:0}: Error finding container cec5686fe1e1a1f84bb5df1656188acc353ee8946c08252e7d7d52ef47d1e652: Status 404 returned error can't find the container with id cec5686fe1e1a1f84bb5df1656188acc353ee8946c08252e7d7d52ef47d1e652 Dec 01 00:20:20 crc kubenswrapper[4911]: I1201 00:20:20.996333 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"97613b36-7079-4bac-afc8-0c933bcb2d4d","Type":"ContainerStarted","Data":"cec5686fe1e1a1f84bb5df1656188acc353ee8946c08252e7d7d52ef47d1e652"} Dec 01 00:20:21 crc kubenswrapper[4911]: I1201 00:20:21.311653 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:20:21 crc kubenswrapper[4911]: I1201 00:20:21.311726 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:20:30 crc kubenswrapper[4911]: I1201 00:20:30.790128 4911 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9"] Dec 01 00:20:30 crc kubenswrapper[4911]: I1201 00:20:30.793828 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" Dec 01 00:20:30 crc kubenswrapper[4911]: I1201 00:20:30.797436 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 01 00:20:30 crc kubenswrapper[4911]: I1201 00:20:30.797693 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 01 00:20:30 crc kubenswrapper[4911]: I1201 00:20:30.797982 4911 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-vrsml" Dec 01 00:20:30 crc kubenswrapper[4911]: I1201 00:20:30.805432 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9"] Dec 01 00:20:30 crc kubenswrapper[4911]: I1201 00:20:30.957942 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cn57\" (UniqueName: \"kubernetes.io/projected/afe1284b-6de6-47cc-aa68-7ad5b1acdf7f-kube-api-access-7cn57\") pod \"cert-manager-operator-controller-manager-5446d6888b-6lkv9\" (UID: \"afe1284b-6de6-47cc-aa68-7ad5b1acdf7f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" Dec 01 00:20:30 crc kubenswrapper[4911]: I1201 00:20:30.958024 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/afe1284b-6de6-47cc-aa68-7ad5b1acdf7f-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-6lkv9\" (UID: \"afe1284b-6de6-47cc-aa68-7ad5b1acdf7f\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" Dec 01 00:20:31 crc kubenswrapper[4911]: I1201 00:20:31.059321 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cn57\" (UniqueName: \"kubernetes.io/projected/afe1284b-6de6-47cc-aa68-7ad5b1acdf7f-kube-api-access-7cn57\") pod \"cert-manager-operator-controller-manager-5446d6888b-6lkv9\" (UID: \"afe1284b-6de6-47cc-aa68-7ad5b1acdf7f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" Dec 01 00:20:31 crc kubenswrapper[4911]: I1201 00:20:31.059786 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/afe1284b-6de6-47cc-aa68-7ad5b1acdf7f-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-6lkv9\" (UID: \"afe1284b-6de6-47cc-aa68-7ad5b1acdf7f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" Dec 01 00:20:31 crc kubenswrapper[4911]: I1201 00:20:31.060315 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/afe1284b-6de6-47cc-aa68-7ad5b1acdf7f-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-6lkv9\" (UID: \"afe1284b-6de6-47cc-aa68-7ad5b1acdf7f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" Dec 01 00:20:31 crc kubenswrapper[4911]: I1201 00:20:31.083834 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cn57\" (UniqueName: \"kubernetes.io/projected/afe1284b-6de6-47cc-aa68-7ad5b1acdf7f-kube-api-access-7cn57\") pod \"cert-manager-operator-controller-manager-5446d6888b-6lkv9\" (UID: \"afe1284b-6de6-47cc-aa68-7ad5b1acdf7f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" Dec 01 00:20:31 crc kubenswrapper[4911]: I1201 00:20:31.272346 4911 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.427427 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.428961 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.437235 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.437864 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.439875 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.440374 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.442418 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.509548 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.509638 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s6n\" (UniqueName: \"kubernetes.io/projected/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-kube-api-access-h8s6n\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.509673 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.509778 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.509830 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.509893 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: 
\"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.509917 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.509947 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.510012 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.510042 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.510089 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.510149 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.611340 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.611646 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.611693 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" 
Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.611724 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.611756 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.611929 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.612027 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s6n\" (UniqueName: \"kubernetes.io/projected/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-kube-api-access-h8s6n\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.612063 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-node-pullsecrets\") pod 
\"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.612477 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.612946 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.613678 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.613742 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.613828 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.613858 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.613912 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.614600 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.621010 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.621081 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.622228 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.622881 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.623304 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.623514 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 
00:20:33.623684 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.634220 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s6n\" (UniqueName: \"kubernetes.io/projected/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-kube-api-access-h8s6n\") pod \"service-telemetry-operator-1-build\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:33 crc kubenswrapper[4911]: I1201 00:20:33.762829 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:20:36 crc kubenswrapper[4911]: E1201 00:20:36.310750 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43" Dec 01 00:20:36 crc kubenswrapper[4911]: E1201 00:20:36.311366 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:interconnect-operator,Image:registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43,Command:[qdr-operator],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:60000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:qdr-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_QDROUTERD_IMAGE,Value:registry.redhat.io/amq7/amq-interconnect@sha256:31d87473fa684178a694f9ee331d3c80f2653f9533cb65c2a325752166a077e9,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:amq7-interconnect-operator.v1.10.20,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-48lmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
interconnect-operator-5bb49f789d-b69jt_service-telemetry(64b5c170-0885-429f-b015-6443b497472e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:20:36 crc kubenswrapper[4911]: E1201 00:20:36.312726 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" podUID="64b5c170-0885-429f-b015-6443b497472e" Dec 01 00:20:36 crc kubenswrapper[4911]: I1201 00:20:36.725422 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Dec 01 00:20:36 crc kubenswrapper[4911]: I1201 00:20:36.977143 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9"] Dec 01 00:20:37 crc kubenswrapper[4911]: I1201 00:20:37.107369 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhl82" event={"ID":"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d","Type":"ContainerStarted","Data":"c7eac1f63a501d6a6d070d9a9bd0681b12e1c917fd7b972c55f490dfb3b4fb83"} Dec 01 00:20:37 crc kubenswrapper[4911]: I1201 00:20:37.108570 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"3fe3641c-d59c-47d9-9cf2-74f2a62481f7","Type":"ContainerStarted","Data":"8c499e04ac3cab124f33b38a97e0854cb0ea546538b785ba545d7720907a10c7"} Dec 01 00:20:39 crc kubenswrapper[4911]: I1201 00:20:39.124778 4911 generic.go:334] "Generic (PLEG): container finished" podID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerID="c7eac1f63a501d6a6d070d9a9bd0681b12e1c917fd7b972c55f490dfb3b4fb83" exitCode=0 Dec 01 00:20:39 crc kubenswrapper[4911]: 
I1201 00:20:39.124880 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhl82" event={"ID":"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d","Type":"ContainerDied","Data":"c7eac1f63a501d6a6d070d9a9bd0681b12e1c917fd7b972c55f490dfb3b4fb83"} Dec 01 00:20:39 crc kubenswrapper[4911]: E1201 00:20:39.312724 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43\\\"\"" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" podUID="64b5c170-0885-429f-b015-6443b497472e" Dec 01 00:20:39 crc kubenswrapper[4911]: W1201 00:20:39.323105 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafe1284b_6de6_47cc_aa68_7ad5b1acdf7f.slice/crio-4eb1a95a948d4f2680135b0807ff23cf856c0afa2dc297099ee14bbee16e2ce0 WatchSource:0}: Error finding container 4eb1a95a948d4f2680135b0807ff23cf856c0afa2dc297099ee14bbee16e2ce0: Status 404 returned error can't find the container with id 4eb1a95a948d4f2680135b0807ff23cf856c0afa2dc297099ee14bbee16e2ce0 Dec 01 00:20:40 crc kubenswrapper[4911]: I1201 00:20:40.134269 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" event={"ID":"afe1284b-6de6-47cc-aa68-7ad5b1acdf7f","Type":"ContainerStarted","Data":"4eb1a95a948d4f2680135b0807ff23cf856c0afa2dc297099ee14bbee16e2ce0"} Dec 01 00:20:43 crc kubenswrapper[4911]: I1201 00:20:43.774899 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.207686 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] 
Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.209493 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.211703 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.211762 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.211815 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.211837 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 
00:20:46.211891 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.211914 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vg7l\" (UniqueName: \"kubernetes.io/projected/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-kube-api-access-2vg7l\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.212014 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.212096 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.212899 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.213527 4911 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.245989 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.247287 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.261767 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.261817 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.261890 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.261990 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363786 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363830 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363852 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363873 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363899 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363922 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363940 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363963 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363981 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.363999 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.364021 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vg7l\" (UniqueName: \"kubernetes.io/projected/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-kube-api-access-2vg7l\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.364048 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.364476 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.364684 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-run\") pod 
\"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.364774 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.364861 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.365109 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.365368 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.365852 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.365891 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.366371 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.379231 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.380255 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vg7l\" (UniqueName: \"kubernetes.io/projected/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-kube-api-access-2vg7l\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.380763 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:46 crc kubenswrapper[4911]: I1201 00:20:46.571838 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:20:51 crc kubenswrapper[4911]: I1201 00:20:51.312151 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:20:51 crc kubenswrapper[4911]: I1201 00:20:51.312981 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:20:51 crc kubenswrapper[4911]: I1201 00:20:51.313032 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:20:51 crc kubenswrapper[4911]: I1201 00:20:51.314003 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90b36241e5b9b053d99526d93d8d01cf61ef69de06fe015790f530836c79c9f7"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:20:51 crc kubenswrapper[4911]: I1201 00:20:51.314058 4911 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://90b36241e5b9b053d99526d93d8d01cf61ef69de06fe015790f530836c79c9f7" gracePeriod=600 Dec 01 00:20:52 crc kubenswrapper[4911]: I1201 00:20:52.325632 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="90b36241e5b9b053d99526d93d8d01cf61ef69de06fe015790f530836c79c9f7" exitCode=0 Dec 01 00:20:52 crc kubenswrapper[4911]: I1201 00:20:52.325677 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"90b36241e5b9b053d99526d93d8d01cf61ef69de06fe015790f530836c79c9f7"} Dec 01 00:20:52 crc kubenswrapper[4911]: I1201 00:20:52.325708 4911 scope.go:117] "RemoveContainer" containerID="5394cc273ac4360711d88e7051016a4910c1d8259c73c2bc9b3a4811b5f60a4d" Dec 01 00:21:08 crc kubenswrapper[4911]: E1201 00:21:08.399700 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43" Dec 01 00:21:08 crc kubenswrapper[4911]: E1201 00:21:08.400736 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:interconnect-operator,Image:registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43,Command:[qdr-operator],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:60000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:qdr-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_QDROUTERD_IMAGE,Value:registry.redhat.io/amq7/amq-interconnect@sha256:31d87473fa684178a694f9ee331d3c80f2653f9533cb65c2a325752166a077e9,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:amq7-interconnect-operator.v1.10.20,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-48lmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
interconnect-operator-5bb49f789d-b69jt_service-telemetry(64b5c170-0885-429f-b015-6443b497472e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:21:08 crc kubenswrapper[4911]: E1201 00:21:08.402061 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" podUID="64b5c170-0885-429f-b015-6443b497472e" Dec 01 00:21:10 crc kubenswrapper[4911]: E1201 00:21:10.704175 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cert-manager/cert-manager-operator-rhel9@sha256:fa8de363ab4435c1085ac37f1bad488828c6ae8ba361c5f865c27ef577610911" Dec 01 00:21:10 crc kubenswrapper[4911]: E1201 00:21:10.704387 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cert-manager-operator,Image:registry.redhat.io/cert-manager/cert-manager-operator-rhel9@sha256:fa8de363ab4435c1085ac37f1bad488828c6ae8ba361c5f865c27ef577610911,Command:[/usr/bin/cert-manager-operator],Args:[start --v=$(OPERATOR_LOG_LEVEL) --trusted-ca-configmap=$(TRUSTED_CA_CONFIGMAP_NAME) --cloud-credentials-secret=$(CLOUD_CREDENTIALS_SECRET_NAME) 
--unsupported-addon-features=$(UNSUPPORTED_ADDON_FEATURES)],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:cert-manager-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_WEBHOOK,Value:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_CA_INJECTOR,Value:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_CONTROLLER,Value:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_ACMESOLVER,Value:registry.redhat.io/cert-manager/jetstack-cert-manager-acmesolver-rhel9@sha256:ba937fc4b9eee31422914352c11a45b90754ba4fbe490ea45249b90afdc4e0a7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CERT_MANAGER_ISTIOCSR,Value:registry.redhat.io/cert-manager/cert-manager-istio-csr-rhel9@sha256:af1ac813b8ee414ef215936f05197bc498bccbd540f3e2a93cb522221ba112bc,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.18.3,ValueFrom:nil,},EnvVar{Name:ISTIOCSR_OPERAND_IMAGE_VERSION,Value:0.14.2,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:1.18.0,ValueFrom:nil,},EnvVar{Name:OPERATOR_LOG_LEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:TRUSTED_CA_CONFIGMAP_NAME,Value:,ValueFrom:
nil,},EnvVar{Name:CLOUD_CREDENTIALS_SECRET_NAME,Value:,ValueFrom:nil,},EnvVar{Name:UNSUPPORTED_ADDON_FEATURES,Value:,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cert-manager-operator.v1.18.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{33554432 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tmp,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7cn57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-operator-controller-manager-5446d6888b-6lkv9_cert-manager-operator(afe1284b-6de6-47cc-aa68-7ad5b1acdf7f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:21:10 crc kubenswrapper[4911]: E1201 00:21:10.705637 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled\"" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" podUID="afe1284b-6de6-47cc-aa68-7ad5b1acdf7f" Dec 01 00:21:11 crc kubenswrapper[4911]: E1201 00:21:11.564849 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/cert-manager-operator-rhel9@sha256:fa8de363ab4435c1085ac37f1bad488828c6ae8ba361c5f865c27ef577610911\\\"\"" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" podUID="afe1284b-6de6-47cc-aa68-7ad5b1acdf7f" Dec 01 00:21:13 crc kubenswrapper[4911]: I1201 00:21:13.408254 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Dec 01 00:21:19 crc kubenswrapper[4911]: I1201 00:21:19.626809 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"5b471926ec3d05582c4ed624725570182663e3031685169997a11e92aa05c8b3"} Dec 01 00:21:19 crc kubenswrapper[4911]: I1201 00:21:19.629737 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f6a17336-11f8-44d3-a4f5-bffc3ce3becf","Type":"ContainerStarted","Data":"94e7fd7c26441071f0ea66a61f0d6a953eadacc374e0ee2ab6215d901141d5a3"} Dec 01 00:21:20 crc kubenswrapper[4911]: I1201 00:21:20.636053 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" event={"ID":"b29bd7a0-85c7-43bc-8bab-adcafae9d8dc","Type":"ContainerStarted","Data":"9453ed42ae6f39311265c171b5e15af960dd1e14d6f86064e3e6fdfff973304f"} Dec 01 00:21:20 crc kubenswrapper[4911]: I1201 00:21:20.637107 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:21:20 crc kubenswrapper[4911]: I1201 00:21:20.639162 4911 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-hmcx6 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.45:8081/healthz\": dial tcp 10.217.0.45:8081: connect: connection refused" start-of-body= Dec 01 00:21:20 crc kubenswrapper[4911]: I1201 00:21:20.639208 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" podUID="b29bd7a0-85c7-43bc-8bab-adcafae9d8dc" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.45:8081/healthz\": dial tcp 10.217.0.45:8081: connect: connection refused" Dec 01 00:21:21 crc kubenswrapper[4911]: I1201 00:21:21.653589 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" event={"ID":"3dcb1345-86cc-4712-a549-5ec7b06343f3","Type":"ContainerStarted","Data":"bf9b8fc7999ea7823b7e37079d420520f95900683d5eea5e4996048e9d92aedb"} Dec 01 00:21:21 crc kubenswrapper[4911]: I1201 00:21:21.656604 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" event={"ID":"6b5103e9-aa5a-402b-a755-a2f2be984479","Type":"ContainerStarted","Data":"1f8824f62c74cc849f16e46dccd47a13428409621ffb0adb41ed932c4a46a49c"} Dec 01 00:21:21 crc kubenswrapper[4911]: I1201 00:21:21.664580 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" Dec 01 00:21:21 crc kubenswrapper[4911]: I1201 00:21:21.682606 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-hmcx6" podStartSLOduration=25.301956485 podStartE2EDuration="1m46.68258704s" podCreationTimestamp="2025-12-01 00:19:35 
+0000 UTC" firstStartedPulling="2025-12-01 00:19:36.890049638 +0000 UTC m=+737.028746409" lastFinishedPulling="2025-12-01 00:20:58.270680183 +0000 UTC m=+818.409376964" observedRunningTime="2025-12-01 00:21:20.677690198 +0000 UTC m=+840.816386989" watchObservedRunningTime="2025-12-01 00:21:21.68258704 +0000 UTC m=+841.821283811" Dec 01 00:21:21 crc kubenswrapper[4911]: I1201 00:21:21.714380 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-twxt7" podStartSLOduration=15.082466611 podStartE2EDuration="1m46.714345567s" podCreationTimestamp="2025-12-01 00:19:35 +0000 UTC" firstStartedPulling="2025-12-01 00:19:36.764420388 +0000 UTC m=+736.903117159" lastFinishedPulling="2025-12-01 00:21:08.396299304 +0000 UTC m=+828.534996115" observedRunningTime="2025-12-01 00:21:21.681751787 +0000 UTC m=+841.820448558" watchObservedRunningTime="2025-12-01 00:21:21.714345567 +0000 UTC m=+841.853042338" Dec 01 00:21:21 crc kubenswrapper[4911]: I1201 00:21:21.747745 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh" podStartSLOduration=15.215880948 podStartE2EDuration="1m46.74772487s" podCreationTimestamp="2025-12-01 00:19:35 +0000 UTC" firstStartedPulling="2025-12-01 00:19:36.862660982 +0000 UTC m=+737.001357753" lastFinishedPulling="2025-12-01 00:21:08.394504864 +0000 UTC m=+828.533201675" observedRunningTime="2025-12-01 00:21:21.739276184 +0000 UTC m=+841.877972965" watchObservedRunningTime="2025-12-01 00:21:21.74772487 +0000 UTC m=+841.886421641" Dec 01 00:21:22 crc kubenswrapper[4911]: I1201 00:21:22.664527 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhl82" event={"ID":"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d","Type":"ContainerStarted","Data":"9f7a3853b5f9bdbb073a0705456bf6d14e6d1642bfc63e985ac20061be4500fa"} Dec 01 00:21:22 crc 
kubenswrapper[4911]: I1201 00:21:22.666150 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"3fe3641c-d59c-47d9-9cf2-74f2a62481f7","Type":"ContainerStarted","Data":"6248ece68b58f4766c8ebcb162098d955728ae8683705105aaa74ad145403186"} Dec 01 00:21:22 crc kubenswrapper[4911]: I1201 00:21:22.668243 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" event={"ID":"ac116d8a-ec46-415a-b9bb-357493c28dda","Type":"ContainerStarted","Data":"900de8773ad5afef32bde5d90f56067645cc5fc0fba3bd13aa59ecdc95b7ac71"} Dec 01 00:21:22 crc kubenswrapper[4911]: I1201 00:21:22.669938 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" event={"ID":"b2a8cc4e-d5e8-4825-a42b-9b8534030ff8","Type":"ContainerStarted","Data":"255144c3718db643711abfdcd44132eb0a04f5c7f3c764e73ab4324a780c8966"} Dec 01 00:21:22 crc kubenswrapper[4911]: I1201 00:21:22.670254 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:21:22 crc kubenswrapper[4911]: I1201 00:21:22.690615 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zhl82" podStartSLOduration=11.792494758 podStartE2EDuration="1m4.690590019s" podCreationTimestamp="2025-12-01 00:20:18 +0000 UTC" firstStartedPulling="2025-12-01 00:20:19.989910555 +0000 UTC m=+780.128607326" lastFinishedPulling="2025-12-01 00:21:12.888005786 +0000 UTC m=+833.026702587" observedRunningTime="2025-12-01 00:21:22.683589423 +0000 UTC m=+842.822286214" watchObservedRunningTime="2025-12-01 00:21:22.690590019 +0000 UTC m=+842.829286790" Dec 01 00:21:22 crc kubenswrapper[4911]: I1201 00:21:22.707251 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-78b478d44c-rvs92" podStartSLOduration=11.706846349 podStartE2EDuration="1m47.707229534s" podCreationTimestamp="2025-12-01 00:19:35 +0000 UTC" firstStartedPulling="2025-12-01 00:19:36.88764555 +0000 UTC m=+737.026342322" lastFinishedPulling="2025-12-01 00:21:12.888028696 +0000 UTC m=+833.026725507" observedRunningTime="2025-12-01 00:21:22.69994793 +0000 UTC m=+842.838644711" watchObservedRunningTime="2025-12-01 00:21:22.707229534 +0000 UTC m=+842.845926305" Dec 01 00:21:22 crc kubenswrapper[4911]: I1201 00:21:22.735007 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" podStartSLOduration=16.38611842 podStartE2EDuration="1m47.734977039s" podCreationTimestamp="2025-12-01 00:19:35 +0000 UTC" firstStartedPulling="2025-12-01 00:19:37.045343427 +0000 UTC m=+737.184040238" lastFinishedPulling="2025-12-01 00:21:08.394202046 +0000 UTC m=+828.532898857" observedRunningTime="2025-12-01 00:21:22.73213501 +0000 UTC m=+842.870831771" watchObservedRunningTime="2025-12-01 00:21:22.734977039 +0000 UTC m=+842.873673810" Dec 01 00:21:23 crc kubenswrapper[4911]: I1201 00:21:23.677262 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f6a17336-11f8-44d3-a4f5-bffc3ce3becf","Type":"ContainerStarted","Data":"aac432f1193248e609fa8558884c6c6cc89b624302d1cf978e227261660bf4e6"} Dec 01 00:21:23 crc kubenswrapper[4911]: I1201 00:21:23.677572 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="3fe3641c-d59c-47d9-9cf2-74f2a62481f7" containerName="manage-dockerfile" containerID="cri-o://6248ece68b58f4766c8ebcb162098d955728ae8683705105aaa74ad145403186" gracePeriod=30 Dec 01 00:21:24 crc kubenswrapper[4911]: E1201 00:21:24.095759 4911 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Dec 01 00:21:24 crc kubenswrapper[4911]: E1201 00:21:24.095968 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c /mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:
true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(97613b36-7079-4bac-afc8-0c933bcb2d4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 00:21:24 crc kubenswrapper[4911]: E1201 00:21:24.097183 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" Dec 01 00:21:24 crc 
kubenswrapper[4911]: E1201 00:21:24.153420 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43\\\"\"" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" podUID="64b5c170-0885-429f-b015-6443b497472e" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.687389 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_3fe3641c-d59c-47d9-9cf2-74f2a62481f7/manage-dockerfile/0.log" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.687837 4911 generic.go:334] "Generic (PLEG): container finished" podID="3fe3641c-d59c-47d9-9cf2-74f2a62481f7" containerID="6248ece68b58f4766c8ebcb162098d955728ae8683705105aaa74ad145403186" exitCode=1 Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.688451 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"3fe3641c-d59c-47d9-9cf2-74f2a62481f7","Type":"ContainerDied","Data":"6248ece68b58f4766c8ebcb162098d955728ae8683705105aaa74ad145403186"} Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.688507 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"3fe3641c-d59c-47d9-9cf2-74f2a62481f7","Type":"ContainerDied","Data":"8c499e04ac3cab124f33b38a97e0854cb0ea546538b785ba545d7720907a10c7"} Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.688527 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c499e04ac3cab124f33b38a97e0854cb0ea546538b785ba545d7720907a10c7" Dec 01 00:21:24 crc kubenswrapper[4911]: E1201 00:21:24.689252 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.705734 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_3fe3641c-d59c-47d9-9cf2-74f2a62481f7/manage-dockerfile/0.log" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.705815 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787051 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-proxy-ca-bundles\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787112 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-system-configs\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787142 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-push\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787178 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildcachedir\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787208 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-pull\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787229 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildworkdir\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787258 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-root\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787277 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-ca-bundles\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787305 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-run\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " 
Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787312 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787335 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-blob-cache\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787361 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-node-pullsecrets\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787413 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8s6n\" (UniqueName: \"kubernetes.io/projected/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-kube-api-access-h8s6n\") pod \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\" (UID: \"3fe3641c-d59c-47d9-9cf2-74f2a62481f7\") " Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787570 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.787879 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788022 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788103 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788125 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788140 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788172 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788165 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788366 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788796 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.788799 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.797595 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.799602 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.799756 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-kube-api-access-h8s6n" (OuterVolumeSpecName: "kube-api-access-h8s6n") pod "3fe3641c-d59c-47d9-9cf2-74f2a62481f7" (UID: "3fe3641c-d59c-47d9-9cf2-74f2a62481f7"). InnerVolumeSpecName "kube-api-access-h8s6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.819760 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.846531 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889310 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889360 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889378 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889389 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889400 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889410 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8s6n\" (UniqueName: 
\"kubernetes.io/projected/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-kube-api-access-h8s6n\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889420 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889436 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:24 crc kubenswrapper[4911]: I1201 00:21:24.889447 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/3fe3641c-d59c-47d9-9cf2-74f2a62481f7-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:25 crc kubenswrapper[4911]: I1201 00:21:25.695019 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Dec 01 00:21:25 crc kubenswrapper[4911]: E1201 00:21:25.698253 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" Dec 01 00:21:25 crc kubenswrapper[4911]: I1201 00:21:25.745297 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Dec 01 00:21:25 crc kubenswrapper[4911]: I1201 00:21:25.752684 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Dec 01 00:21:26 crc kubenswrapper[4911]: I1201 00:21:26.178825 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fe3641c-d59c-47d9-9cf2-74f2a62481f7" path="/var/lib/kubelet/pods/3fe3641c-d59c-47d9-9cf2-74f2a62481f7/volumes" Dec 01 00:21:26 crc kubenswrapper[4911]: E1201 00:21:26.824693 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" Dec 01 00:21:26 crc kubenswrapper[4911]: I1201 00:21:26.838112 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-9sg5q" Dec 01 00:21:27 crc kubenswrapper[4911]: I1201 00:21:27.708693 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" 
event={"ID":"afe1284b-6de6-47cc-aa68-7ad5b1acdf7f","Type":"ContainerStarted","Data":"a30f9389c728da28a6986e4d520aaf2e42a3677fd742c572bcaf74fa038aaa44"} Dec 01 00:21:27 crc kubenswrapper[4911]: I1201 00:21:27.785662 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-6lkv9" podStartSLOduration=16.285406854 podStartE2EDuration="57.78564642s" podCreationTimestamp="2025-12-01 00:20:30 +0000 UTC" firstStartedPulling="2025-12-01 00:20:45.326326323 +0000 UTC m=+805.465023094" lastFinishedPulling="2025-12-01 00:21:26.826565889 +0000 UTC m=+846.965262660" observedRunningTime="2025-12-01 00:21:27.780442275 +0000 UTC m=+847.919139066" watchObservedRunningTime="2025-12-01 00:21:27.78564642 +0000 UTC m=+847.924343191" Dec 01 00:21:28 crc kubenswrapper[4911]: I1201 00:21:28.943409 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:21:28 crc kubenswrapper[4911]: I1201 00:21:28.943516 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:21:29 crc kubenswrapper[4911]: I1201 00:21:29.007984 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:21:29 crc kubenswrapper[4911]: I1201 00:21:29.800032 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:21:29 crc kubenswrapper[4911]: I1201 00:21:29.865610 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zhl82"] Dec 01 00:21:31 crc kubenswrapper[4911]: I1201 00:21:31.734113 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zhl82" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" 
containerName="registry-server" containerID="cri-o://9f7a3853b5f9bdbb073a0705456bf6d14e6d1642bfc63e985ac20061be4500fa" gracePeriod=2 Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.446876 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-z25xd"] Dec 01 00:21:33 crc kubenswrapper[4911]: E1201 00:21:33.448491 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe3641c-d59c-47d9-9cf2-74f2a62481f7" containerName="manage-dockerfile" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.448583 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe3641c-d59c-47d9-9cf2-74f2a62481f7" containerName="manage-dockerfile" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.448742 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fe3641c-d59c-47d9-9cf2-74f2a62481f7" containerName="manage-dockerfile" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.449186 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.451817 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.452029 4911 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-8glbm" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.452655 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.460897 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-z25xd"] Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.543073 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bebc7141-b369-409e-8629-25e95690723b-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-z25xd\" (UID: \"bebc7141-b369-409e-8629-25e95690723b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.543131 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26nlf\" (UniqueName: \"kubernetes.io/projected/bebc7141-b369-409e-8629-25e95690723b-kube-api-access-26nlf\") pod \"cert-manager-webhook-f4fb5df64-z25xd\" (UID: \"bebc7141-b369-409e-8629-25e95690723b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.644386 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bebc7141-b369-409e-8629-25e95690723b-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-z25xd\" (UID: \"bebc7141-b369-409e-8629-25e95690723b\") 
" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.644889 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26nlf\" (UniqueName: \"kubernetes.io/projected/bebc7141-b369-409e-8629-25e95690723b-kube-api-access-26nlf\") pod \"cert-manager-webhook-f4fb5df64-z25xd\" (UID: \"bebc7141-b369-409e-8629-25e95690723b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.664205 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26nlf\" (UniqueName: \"kubernetes.io/projected/bebc7141-b369-409e-8629-25e95690723b-kube-api-access-26nlf\") pod \"cert-manager-webhook-f4fb5df64-z25xd\" (UID: \"bebc7141-b369-409e-8629-25e95690723b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.678331 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bebc7141-b369-409e-8629-25e95690723b-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-z25xd\" (UID: \"bebc7141-b369-409e-8629-25e95690723b\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.750134 4911 generic.go:334] "Generic (PLEG): container finished" podID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerID="9f7a3853b5f9bdbb073a0705456bf6d14e6d1642bfc63e985ac20061be4500fa" exitCode=0 Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.750171 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhl82" event={"ID":"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d","Type":"ContainerDied","Data":"9f7a3853b5f9bdbb073a0705456bf6d14e6d1642bfc63e985ac20061be4500fa"} Dec 01 00:21:33 crc kubenswrapper[4911]: I1201 00:21:33.769300 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.085672 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-q54sn"] Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.099171 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.105827 4911 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2qz42" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.111297 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-q54sn"] Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.158130 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7fr\" (UniqueName: \"kubernetes.io/projected/0daadb0a-db56-459b-b756-0f57d9dc0529-kube-api-access-pq7fr\") pod \"cert-manager-cainjector-855d9ccff4-q54sn\" (UID: \"0daadb0a-db56-459b-b756-0f57d9dc0529\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.158216 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0daadb0a-db56-459b-b756-0f57d9dc0529-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-q54sn\" (UID: \"0daadb0a-db56-459b-b756-0f57d9dc0529\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.245500 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.248803 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-z25xd"] Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.266111 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7fr\" (UniqueName: \"kubernetes.io/projected/0daadb0a-db56-459b-b756-0f57d9dc0529-kube-api-access-pq7fr\") pod \"cert-manager-cainjector-855d9ccff4-q54sn\" (UID: \"0daadb0a-db56-459b-b756-0f57d9dc0529\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.266236 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0daadb0a-db56-459b-b756-0f57d9dc0529-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-q54sn\" (UID: \"0daadb0a-db56-459b-b756-0f57d9dc0529\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.321710 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7fr\" (UniqueName: \"kubernetes.io/projected/0daadb0a-db56-459b-b756-0f57d9dc0529-kube-api-access-pq7fr\") pod \"cert-manager-cainjector-855d9ccff4-q54sn\" (UID: \"0daadb0a-db56-459b-b756-0f57d9dc0529\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.328028 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0daadb0a-db56-459b-b756-0f57d9dc0529-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-q54sn\" (UID: \"0daadb0a-db56-459b-b756-0f57d9dc0529\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.367057 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-catalog-content\") pod \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.367147 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw5bl\" (UniqueName: \"kubernetes.io/projected/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-kube-api-access-lw5bl\") pod \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.367202 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-utilities\") pod \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\" (UID: \"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d\") " Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.368117 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-utilities" (OuterVolumeSpecName: "utilities") pod "463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" (UID: "463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.377781 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-kube-api-access-lw5bl" (OuterVolumeSpecName: "kube-api-access-lw5bl") pod "463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" (UID: "463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d"). InnerVolumeSpecName "kube-api-access-lw5bl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.469178 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.469216 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw5bl\" (UniqueName: \"kubernetes.io/projected/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-kube-api-access-lw5bl\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.478927 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.481845 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" (UID: "463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.570064 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.758620 4911 generic.go:334] "Generic (PLEG): container finished" podID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerID="aac432f1193248e609fa8558884c6c6cc89b624302d1cf978e227261660bf4e6" exitCode=0 Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.758684 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f6a17336-11f8-44d3-a4f5-bffc3ce3becf","Type":"ContainerDied","Data":"aac432f1193248e609fa8558884c6c6cc89b624302d1cf978e227261660bf4e6"} Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.765619 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" event={"ID":"bebc7141-b369-409e-8629-25e95690723b","Type":"ContainerStarted","Data":"40955f4d08b80c8762cf5382302e4cf1ccd87a7231f6ed9ebbf602917d76cb9a"} Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.770890 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhl82" event={"ID":"463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d","Type":"ContainerDied","Data":"bcedeebb8aa173b05fc96fc35c14a66d3d76e193f990d0ce1cdd473a604b08f6"} Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.770928 4911 scope.go:117] "RemoveContainer" containerID="9f7a3853b5f9bdbb073a0705456bf6d14e6d1642bfc63e985ac20061be4500fa" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.771028 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zhl82" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.806656 4911 scope.go:117] "RemoveContainer" containerID="c7eac1f63a501d6a6d070d9a9bd0681b12e1c917fd7b972c55f490dfb3b4fb83" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.832592 4911 scope.go:117] "RemoveContainer" containerID="ad946bd58b737b26bd8c204c60f77460d295aa042175fff80f3b2849d36c5bcd" Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.847581 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zhl82"] Dec 01 00:21:34 crc kubenswrapper[4911]: I1201 00:21:34.853129 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zhl82"] Dec 01 00:21:35 crc kubenswrapper[4911]: I1201 00:21:35.036383 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-q54sn"] Dec 01 00:21:35 crc kubenswrapper[4911]: I1201 00:21:35.780607 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f6a17336-11f8-44d3-a4f5-bffc3ce3becf","Type":"ContainerStarted","Data":"9d849824e2dd69d82bc2a0a73abdc8a6c0af4286277b3c33bd6fb7900b6f54ee"} Dec 01 00:21:35 crc kubenswrapper[4911]: I1201 00:21:35.781612 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" event={"ID":"0daadb0a-db56-459b-b756-0f57d9dc0529","Type":"ContainerStarted","Data":"f62adb824207ea5e2aba925b6f766a75a3f798abc24ba8d5b580ba572b79db15"} Dec 01 00:21:36 crc kubenswrapper[4911]: I1201 00:21:36.167221 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" path="/var/lib/kubelet/pods/463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d/volumes" Dec 01 00:21:37 crc kubenswrapper[4911]: I1201 00:21:37.799169 4911 generic.go:334] "Generic (PLEG): container finished" 
podID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerID="9d849824e2dd69d82bc2a0a73abdc8a6c0af4286277b3c33bd6fb7900b6f54ee" exitCode=0 Dec 01 00:21:37 crc kubenswrapper[4911]: I1201 00:21:37.799374 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f6a17336-11f8-44d3-a4f5-bffc3ce3becf","Type":"ContainerDied","Data":"9d849824e2dd69d82bc2a0a73abdc8a6c0af4286277b3c33bd6fb7900b6f54ee"} Dec 01 00:21:38 crc kubenswrapper[4911]: I1201 00:21:38.811710 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f6a17336-11f8-44d3-a4f5-bffc3ce3becf","Type":"ContainerStarted","Data":"54512a6cc8b4d5bfc4bce1df3c1dfd9bc56c34d1c08b8e9275875f9dcce22da9"} Dec 01 00:21:38 crc kubenswrapper[4911]: I1201 00:21:38.839241 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=52.839218213 podStartE2EDuration="52.839218213s" podCreationTimestamp="2025-12-01 00:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:21:38.836577039 +0000 UTC m=+858.975273820" watchObservedRunningTime="2025-12-01 00:21:38.839218213 +0000 UTC m=+858.977914984" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.323044 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-f775q"] Dec 01 00:21:43 crc kubenswrapper[4911]: E1201 00:21:43.323981 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerName="extract-utilities" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.323998 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerName="extract-utilities" Dec 01 00:21:43 crc kubenswrapper[4911]: E1201 
00:21:43.324010 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerName="extract-content" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.324018 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerName="extract-content" Dec 01 00:21:43 crc kubenswrapper[4911]: E1201 00:21:43.324034 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerName="registry-server" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.324040 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerName="registry-server" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.324181 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="463d1ab1-c7b7-4fbe-aa8f-7bb459d7294d" containerName="registry-server" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.324611 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-f775q" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.326261 4911 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-mwk87" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.336422 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-f775q"] Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.401071 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5511c855-0f49-4d84-83da-32932f2e4b1a-bound-sa-token\") pod \"cert-manager-86cb77c54b-f775q\" (UID: \"5511c855-0f49-4d84-83da-32932f2e4b1a\") " pod="cert-manager/cert-manager-86cb77c54b-f775q" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.401121 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvc69\" (UniqueName: \"kubernetes.io/projected/5511c855-0f49-4d84-83da-32932f2e4b1a-kube-api-access-zvc69\") pod \"cert-manager-86cb77c54b-f775q\" (UID: \"5511c855-0f49-4d84-83da-32932f2e4b1a\") " pod="cert-manager/cert-manager-86cb77c54b-f775q" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.502782 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvc69\" (UniqueName: \"kubernetes.io/projected/5511c855-0f49-4d84-83da-32932f2e4b1a-kube-api-access-zvc69\") pod \"cert-manager-86cb77c54b-f775q\" (UID: \"5511c855-0f49-4d84-83da-32932f2e4b1a\") " pod="cert-manager/cert-manager-86cb77c54b-f775q" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.502900 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5511c855-0f49-4d84-83da-32932f2e4b1a-bound-sa-token\") pod \"cert-manager-86cb77c54b-f775q\" (UID: 
\"5511c855-0f49-4d84-83da-32932f2e4b1a\") " pod="cert-manager/cert-manager-86cb77c54b-f775q" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.525943 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5511c855-0f49-4d84-83da-32932f2e4b1a-bound-sa-token\") pod \"cert-manager-86cb77c54b-f775q\" (UID: \"5511c855-0f49-4d84-83da-32932f2e4b1a\") " pod="cert-manager/cert-manager-86cb77c54b-f775q" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.527176 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvc69\" (UniqueName: \"kubernetes.io/projected/5511c855-0f49-4d84-83da-32932f2e4b1a-kube-api-access-zvc69\") pod \"cert-manager-86cb77c54b-f775q\" (UID: \"5511c855-0f49-4d84-83da-32932f2e4b1a\") " pod="cert-manager/cert-manager-86cb77c54b-f775q" Dec 01 00:21:43 crc kubenswrapper[4911]: I1201 00:21:43.643194 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-f775q" Dec 01 00:22:38 crc kubenswrapper[4911]: I1201 00:22:38.161470 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-f775q"] Dec 01 00:22:39 crc kubenswrapper[4911]: E1201 00:22:39.005725 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Dec 01 00:22:39 crc kubenswrapper[4911]: E1201 00:22:39.005922 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c 
/mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:
true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(97613b36-7079-4bac-afc8-0c933bcb2d4d): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 01 00:22:39 crc kubenswrapper[4911]: E1201 00:22:39.007017 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" Dec 01 00:22:39 crc 
kubenswrapper[4911]: I1201 00:22:39.293489 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-f775q" event={"ID":"5511c855-0f49-4d84-83da-32932f2e4b1a","Type":"ContainerStarted","Data":"1ba46ac15a80823eba5f510c17b82db12b81fedd41e2999a6c4e56edb199561c"} Dec 01 00:22:40 crc kubenswrapper[4911]: I1201 00:22:40.302645 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" event={"ID":"64b5c170-0885-429f-b015-6443b497472e","Type":"ContainerStarted","Data":"450559ad150e3dbcae9ad50adcdc8912f9f7471c85f9c9fff45dbff6de14f629"} Dec 01 00:22:40 crc kubenswrapper[4911]: I1201 00:22:40.317362 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-b69jt" podStartSLOduration=26.651356308 podStartE2EDuration="2m50.317341294s" podCreationTimestamp="2025-12-01 00:19:50 +0000 UTC" firstStartedPulling="2025-12-01 00:20:15.340436835 +0000 UTC m=+775.479133606" lastFinishedPulling="2025-12-01 00:22:39.006421821 +0000 UTC m=+919.145118592" observedRunningTime="2025-12-01 00:22:40.313662501 +0000 UTC m=+920.452359272" watchObservedRunningTime="2025-12-01 00:22:40.317341294 +0000 UTC m=+920.456038065" Dec 01 00:22:41 crc kubenswrapper[4911]: E1201 00:22:41.216716 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df" Dec 01 00:22:41 crc kubenswrapper[4911]: E1201 00:22:41.217124 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cert-manager-cainjector,Image:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,Command:[/app/cmd/cainjector/cainjector],Args:[--leader-election-namespace=kube-system --v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:9402,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bound-sa-token,ReadOnly:true,MountPath:/var/run/secrets/openshift/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pq7fr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-cainjector-855d9ccff4-q54sn_cert-manager(0daadb0a-db56-459b-b756-0f57d9dc0529): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:22:41 crc 
kubenswrapper[4911]: E1201 00:22:41.218356 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-cainjector\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" podUID="0daadb0a-db56-459b-b756-0f57d9dc0529" Dec 01 00:22:41 crc kubenswrapper[4911]: E1201 00:22:41.310147 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-cainjector\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df\\\"\"" pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" podUID="0daadb0a-db56-459b-b756-0f57d9dc0529" Dec 01 00:22:42 crc kubenswrapper[4911]: E1201 00:22:42.563541 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df" Dec 01 00:22:42 crc kubenswrapper[4911]: E1201 00:22:42.563747 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cert-manager-webhook,Image:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:29a0fa1c2f2a6cee62a0468a3883d16d491b4af29130dad6e3e2bb2948f274df,Command:[/app/cmd/webhook/webhook],Args:[--dynamic-serving-ca-secret-name=cert-manager-webhook-ca --dynamic-serving-ca-secret-namespace=$(POD_NAMESPACE) --dynamic-serving-dns-names=cert-manager-webhook,cert-manager-webhook.$(POD_NAMESPACE),cert-manager-webhook.$(POD_NAMESPACE).svc --secure-port=10250 
--v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:10250,Protocol:TCP,HostIP:,},ContainerPort{Name:healthcheck,HostPort:0,ContainerPort:6080,Protocol:TCP,HostIP:,},ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:9402,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bound-sa-token,ReadOnly:true,MountPath:/var/run/secrets/openshift/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26nlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{1 0 healthcheck},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:60,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{1 0 
healthcheck},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-webhook-f4fb5df64-z25xd_cert-manager(bebc7141-b369-409e-8629-25e95690723b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 00:22:42 crc kubenswrapper[4911]: E1201 00:22:42.564996 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" podUID="bebc7141-b369-409e-8629-25e95690723b" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.170991 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5sw59"] Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.172859 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.197927 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5sw59"] Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.286249 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-catalog-content\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.286529 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbgtp\" (UniqueName: \"kubernetes.io/projected/97059c92-c1b1-41eb-8fcb-a28effa936f7-kube-api-access-lbgtp\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.286615 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-utilities\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.399053 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-catalog-content\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.399215 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lbgtp\" (UniqueName: \"kubernetes.io/projected/97059c92-c1b1-41eb-8fcb-a28effa936f7-kube-api-access-lbgtp\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.399260 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-utilities\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.399940 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-catalog-content\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.399982 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-utilities\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.424601 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbgtp\" (UniqueName: \"kubernetes.io/projected/97059c92-c1b1-41eb-8fcb-a28effa936f7-kube-api-access-lbgtp\") pod \"community-operators-5sw59\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:49 crc kubenswrapper[4911]: I1201 00:22:49.600228 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:22:50 crc kubenswrapper[4911]: I1201 00:22:50.375589 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" event={"ID":"bebc7141-b369-409e-8629-25e95690723b","Type":"ContainerStarted","Data":"39670639753d1dfac0dd46c5a7789832169772147d26fc453a442e2101f1b149"} Dec 01 00:22:50 crc kubenswrapper[4911]: I1201 00:22:50.400069 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5sw59"] Dec 01 00:22:51 crc kubenswrapper[4911]: I1201 00:22:51.385841 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sw59" event={"ID":"97059c92-c1b1-41eb-8fcb-a28effa936f7","Type":"ContainerStarted","Data":"7c25543709de68b73f804ff5add76008a98344b5ea449de94b5a7587dd6b1643"} Dec 01 00:22:52 crc kubenswrapper[4911]: E1201 00:22:52.154913 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" Dec 01 00:22:55 crc kubenswrapper[4911]: I1201 00:22:55.415540 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-f775q" event={"ID":"5511c855-0f49-4d84-83da-32932f2e4b1a","Type":"ContainerStarted","Data":"f1b035a8da4137d0665000200a34ec08555d930b203c2fcb42e0dca9e8710c9c"} Dec 01 00:22:56 crc kubenswrapper[4911]: I1201 00:22:56.423344 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sw59" event={"ID":"97059c92-c1b1-41eb-8fcb-a28effa936f7","Type":"ContainerStarted","Data":"4c206d27569a4da7e6efad335746e76bba2648a9073a4dbd147664cf4156efe5"} Dec 01 00:22:56 crc kubenswrapper[4911]: I1201 
00:22:56.423744 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:22:56 crc kubenswrapper[4911]: I1201 00:22:56.426280 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" Dec 01 00:22:56 crc kubenswrapper[4911]: I1201 00:22:56.442432 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-z25xd" podStartSLOduration=-9223371953.412367 podStartE2EDuration="1m23.442409151s" podCreationTimestamp="2025-12-01 00:21:33 +0000 UTC" firstStartedPulling="2025-12-01 00:21:34.266247462 +0000 UTC m=+854.404944233" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:22:56.439547811 +0000 UTC m=+936.578244592" watchObservedRunningTime="2025-12-01 00:22:56.442409151 +0000 UTC m=+936.581105932" Dec 01 00:22:57 crc kubenswrapper[4911]: I1201 00:22:57.435365 4911 generic.go:334] "Generic (PLEG): container finished" podID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerID="4c206d27569a4da7e6efad335746e76bba2648a9073a4dbd147664cf4156efe5" exitCode=0 Dec 01 00:22:57 crc kubenswrapper[4911]: I1201 00:22:57.436894 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sw59" event={"ID":"97059c92-c1b1-41eb-8fcb-a28effa936f7","Type":"ContainerDied","Data":"4c206d27569a4da7e6efad335746e76bba2648a9073a4dbd147664cf4156efe5"} Dec 01 00:22:57 crc kubenswrapper[4911]: I1201 00:22:57.470428 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-f775q" podStartSLOduration=64.580800562 podStartE2EDuration="1m14.470397238s" podCreationTimestamp="2025-12-01 00:21:43 +0000 UTC" firstStartedPulling="2025-12-01 00:22:39.009218789 +0000 UTC m=+919.147915570" lastFinishedPulling="2025-12-01 00:22:48.898815465 +0000 UTC m=+929.037512246" 
observedRunningTime="2025-12-01 00:22:57.462049504 +0000 UTC m=+937.600746305" watchObservedRunningTime="2025-12-01 00:22:57.470397238 +0000 UTC m=+937.609094039" Dec 01 00:22:58 crc kubenswrapper[4911]: I1201 00:22:58.443161 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" event={"ID":"0daadb0a-db56-459b-b756-0f57d9dc0529","Type":"ContainerStarted","Data":"880fcb8581c1677e17e8c225fde0b7dc6d6b1c43101a5c30ebfd89ced16152c1"} Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.081981 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jwkb2"] Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.084155 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.102759 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwkb2"] Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.165571 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-catalog-content\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.165811 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k84r4\" (UniqueName: \"kubernetes.io/projected/57d84a9c-6e84-47ab-8e9c-8237c11e1420-kube-api-access-k84r4\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.165973 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-utilities\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.267385 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-catalog-content\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.267487 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k84r4\" (UniqueName: \"kubernetes.io/projected/57d84a9c-6e84-47ab-8e9c-8237c11e1420-kube-api-access-k84r4\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.267530 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-utilities\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.268089 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-catalog-content\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.268195 4911 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-utilities\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.290911 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k84r4\" (UniqueName: \"kubernetes.io/projected/57d84a9c-6e84-47ab-8e9c-8237c11e1420-kube-api-access-k84r4\") pod \"certified-operators-jwkb2\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.414149 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:04 crc kubenswrapper[4911]: I1201 00:23:04.938967 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwkb2"] Dec 01 00:23:04 crc kubenswrapper[4911]: W1201 00:23:04.946124 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d84a9c_6e84_47ab_8e9c_8237c11e1420.slice/crio-273329e5f065b9e917bfebde7ba45ea3e32beb07aa647c8d3ddcaeea88486ef1 WatchSource:0}: Error finding container 273329e5f065b9e917bfebde7ba45ea3e32beb07aa647c8d3ddcaeea88486ef1: Status 404 returned error can't find the container with id 273329e5f065b9e917bfebde7ba45ea3e32beb07aa647c8d3ddcaeea88486ef1 Dec 01 00:23:05 crc kubenswrapper[4911]: I1201 00:23:05.497757 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwkb2" event={"ID":"57d84a9c-6e84-47ab-8e9c-8237c11e1420","Type":"ContainerStarted","Data":"273329e5f065b9e917bfebde7ba45ea3e32beb07aa647c8d3ddcaeea88486ef1"} Dec 01 00:23:08 crc kubenswrapper[4911]: I1201 00:23:08.521178 4911 generic.go:334] 
"Generic (PLEG): container finished" podID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerID="a21595dba10be44c65f5e20199b1da0324eeefd307a960c140c736ce16f11312" exitCode=0 Dec 01 00:23:08 crc kubenswrapper[4911]: I1201 00:23:08.521285 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwkb2" event={"ID":"57d84a9c-6e84-47ab-8e9c-8237c11e1420","Type":"ContainerDied","Data":"a21595dba10be44c65f5e20199b1da0324eeefd307a960c140c736ce16f11312"} Dec 01 00:23:08 crc kubenswrapper[4911]: I1201 00:23:08.556611 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-q54sn" podStartSLOduration=-9223371942.298193 podStartE2EDuration="1m34.556583473s" podCreationTimestamp="2025-12-01 00:21:34 +0000 UTC" firstStartedPulling="2025-12-01 00:21:35.050346073 +0000 UTC m=+855.189042844" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:23:08.550860223 +0000 UTC m=+948.689556994" watchObservedRunningTime="2025-12-01 00:23:08.556583473 +0000 UTC m=+948.695280244" Dec 01 00:23:12 crc kubenswrapper[4911]: I1201 00:23:12.550720 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"97613b36-7079-4bac-afc8-0c933bcb2d4d","Type":"ContainerStarted","Data":"da260b2b74f3c6f363727e51d60639cddaaf6a25d5563322aca4692a89134c00"} Dec 01 00:23:13 crc kubenswrapper[4911]: I1201 00:23:13.558227 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sw59" event={"ID":"97059c92-c1b1-41eb-8fcb-a28effa936f7","Type":"ContainerStarted","Data":"71f5dc1960e195966d4ddf2da648b0be24aff3586f0023b76cb60605d5ef8767"} Dec 01 00:23:14 crc kubenswrapper[4911]: I1201 00:23:14.565416 4911 generic.go:334] "Generic (PLEG): container finished" podID="97059c92-c1b1-41eb-8fcb-a28effa936f7" 
containerID="71f5dc1960e195966d4ddf2da648b0be24aff3586f0023b76cb60605d5ef8767" exitCode=0 Dec 01 00:23:14 crc kubenswrapper[4911]: I1201 00:23:14.565601 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sw59" event={"ID":"97059c92-c1b1-41eb-8fcb-a28effa936f7","Type":"ContainerDied","Data":"71f5dc1960e195966d4ddf2da648b0be24aff3586f0023b76cb60605d5ef8767"} Dec 01 00:23:14 crc kubenswrapper[4911]: I1201 00:23:14.570447 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwkb2" event={"ID":"57d84a9c-6e84-47ab-8e9c-8237c11e1420","Type":"ContainerStarted","Data":"0c4aaa54b69527d2bb8949df5697e7f85ce250f32118598c5ec5a654e527de5c"} Dec 01 00:23:15 crc kubenswrapper[4911]: I1201 00:23:15.581754 4911 generic.go:334] "Generic (PLEG): container finished" podID="97613b36-7079-4bac-afc8-0c933bcb2d4d" containerID="da260b2b74f3c6f363727e51d60639cddaaf6a25d5563322aca4692a89134c00" exitCode=0 Dec 01 00:23:15 crc kubenswrapper[4911]: I1201 00:23:15.581875 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"97613b36-7079-4bac-afc8-0c933bcb2d4d","Type":"ContainerDied","Data":"da260b2b74f3c6f363727e51d60639cddaaf6a25d5563322aca4692a89134c00"} Dec 01 00:23:15 crc kubenswrapper[4911]: I1201 00:23:15.586299 4911 generic.go:334] "Generic (PLEG): container finished" podID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerID="0c4aaa54b69527d2bb8949df5697e7f85ce250f32118598c5ec5a654e527de5c" exitCode=0 Dec 01 00:23:15 crc kubenswrapper[4911]: I1201 00:23:15.586365 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwkb2" event={"ID":"57d84a9c-6e84-47ab-8e9c-8237c11e1420","Type":"ContainerDied","Data":"0c4aaa54b69527d2bb8949df5697e7f85ce250f32118598c5ec5a654e527de5c"} Dec 01 00:23:16 crc kubenswrapper[4911]: I1201 00:23:16.597076 4911 generic.go:334] "Generic (PLEG): container 
finished" podID="97613b36-7079-4bac-afc8-0c933bcb2d4d" containerID="9b0b3c3ebcbd1e22f8a1496a0773f5f53b144ce07359c5e50c6d9adbdf2b2c1f" exitCode=0 Dec 01 00:23:16 crc kubenswrapper[4911]: I1201 00:23:16.597194 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"97613b36-7079-4bac-afc8-0c933bcb2d4d","Type":"ContainerDied","Data":"9b0b3c3ebcbd1e22f8a1496a0773f5f53b144ce07359c5e50c6d9adbdf2b2c1f"} Dec 01 00:23:16 crc kubenswrapper[4911]: I1201 00:23:16.600670 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sw59" event={"ID":"97059c92-c1b1-41eb-8fcb-a28effa936f7","Type":"ContainerStarted","Data":"c0485abe06dce2320e531d7aa1e3f48f7ce61d69c59bd87373f44f33805df6e8"} Dec 01 00:23:16 crc kubenswrapper[4911]: I1201 00:23:16.671072 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5sw59" podStartSLOduration=18.551153717 podStartE2EDuration="27.671032892s" podCreationTimestamp="2025-12-01 00:22:49 +0000 UTC" firstStartedPulling="2025-12-01 00:23:06.509142028 +0000 UTC m=+946.647838829" lastFinishedPulling="2025-12-01 00:23:15.629021233 +0000 UTC m=+955.767718004" observedRunningTime="2025-12-01 00:23:16.667489523 +0000 UTC m=+956.806186294" watchObservedRunningTime="2025-12-01 00:23:16.671032892 +0000 UTC m=+956.809729713" Dec 01 00:23:17 crc kubenswrapper[4911]: I1201 00:23:17.607892 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"97613b36-7079-4bac-afc8-0c933bcb2d4d","Type":"ContainerStarted","Data":"d63ed1f6312d6320919d21ef09c0bcb4fecc4b6999257a78879776cea9736c60"} Dec 01 00:23:17 crc kubenswrapper[4911]: I1201 00:23:17.609282 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:23:17 crc kubenswrapper[4911]: I1201 00:23:17.612120 4911 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwkb2" event={"ID":"57d84a9c-6e84-47ab-8e9c-8237c11e1420","Type":"ContainerStarted","Data":"b37c1e3b64f8f3d7d46518c878196bded39a57b9c1482939ac4acb52f63bfc3a"} Dec 01 00:23:17 crc kubenswrapper[4911]: I1201 00:23:17.646561 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=6.742534679 podStartE2EDuration="2m57.646543663s" podCreationTimestamp="2025-12-01 00:20:20 +0000 UTC" firstStartedPulling="2025-12-01 00:20:20.972954816 +0000 UTC m=+781.111651587" lastFinishedPulling="2025-12-01 00:23:11.87696379 +0000 UTC m=+952.015660571" observedRunningTime="2025-12-01 00:23:17.64142274 +0000 UTC m=+957.780119531" watchObservedRunningTime="2025-12-01 00:23:17.646543663 +0000 UTC m=+957.785240454" Dec 01 00:23:17 crc kubenswrapper[4911]: I1201 00:23:17.665294 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jwkb2" podStartSLOduration=5.274944818 podStartE2EDuration="13.665275676s" podCreationTimestamp="2025-12-01 00:23:04 +0000 UTC" firstStartedPulling="2025-12-01 00:23:08.52568237 +0000 UTC m=+948.664379151" lastFinishedPulling="2025-12-01 00:23:16.916013238 +0000 UTC m=+957.054710009" observedRunningTime="2025-12-01 00:23:17.661738388 +0000 UTC m=+957.800435159" watchObservedRunningTime="2025-12-01 00:23:17.665275676 +0000 UTC m=+957.803972447" Dec 01 00:23:19 crc kubenswrapper[4911]: I1201 00:23:19.600613 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:23:19 crc kubenswrapper[4911]: I1201 00:23:19.601014 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:23:19 crc kubenswrapper[4911]: I1201 00:23:19.664987 4911 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:23:21 crc kubenswrapper[4911]: I1201 00:23:21.312086 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:23:21 crc kubenswrapper[4911]: I1201 00:23:21.312175 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:23:24 crc kubenswrapper[4911]: I1201 00:23:24.414610 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:24 crc kubenswrapper[4911]: I1201 00:23:24.415144 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:24 crc kubenswrapper[4911]: I1201 00:23:24.607652 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:25 crc kubenswrapper[4911]: I1201 00:23:25.247616 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:25 crc kubenswrapper[4911]: I1201 00:23:25.317569 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwkb2"] Dec 01 00:23:27 crc kubenswrapper[4911]: I1201 00:23:27.193375 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jwkb2" 
podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerName="registry-server" containerID="cri-o://b37c1e3b64f8f3d7d46518c878196bded39a57b9c1482939ac4acb52f63bfc3a" gracePeriod=2 Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.205349 4911 generic.go:334] "Generic (PLEG): container finished" podID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerID="b37c1e3b64f8f3d7d46518c878196bded39a57b9c1482939ac4acb52f63bfc3a" exitCode=0 Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.205419 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwkb2" event={"ID":"57d84a9c-6e84-47ab-8e9c-8237c11e1420","Type":"ContainerDied","Data":"b37c1e3b64f8f3d7d46518c878196bded39a57b9c1482939ac4acb52f63bfc3a"} Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.370745 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.503688 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-catalog-content\") pod \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.503868 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k84r4\" (UniqueName: \"kubernetes.io/projected/57d84a9c-6e84-47ab-8e9c-8237c11e1420-kube-api-access-k84r4\") pod \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\" (UID: \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.503984 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-utilities\") pod \"57d84a9c-6e84-47ab-8e9c-8237c11e1420\" (UID: 
\"57d84a9c-6e84-47ab-8e9c-8237c11e1420\") " Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.505129 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-utilities" (OuterVolumeSpecName: "utilities") pod "57d84a9c-6e84-47ab-8e9c-8237c11e1420" (UID: "57d84a9c-6e84-47ab-8e9c-8237c11e1420"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.513191 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d84a9c-6e84-47ab-8e9c-8237c11e1420-kube-api-access-k84r4" (OuterVolumeSpecName: "kube-api-access-k84r4") pod "57d84a9c-6e84-47ab-8e9c-8237c11e1420" (UID: "57d84a9c-6e84-47ab-8e9c-8237c11e1420"). InnerVolumeSpecName "kube-api-access-k84r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.556295 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57d84a9c-6e84-47ab-8e9c-8237c11e1420" (UID: "57d84a9c-6e84-47ab-8e9c-8237c11e1420"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.605990 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k84r4\" (UniqueName: \"kubernetes.io/projected/57d84a9c-6e84-47ab-8e9c-8237c11e1420-kube-api-access-k84r4\") on node \"crc\" DevicePath \"\"" Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.606056 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:23:28 crc kubenswrapper[4911]: I1201 00:23:28.606085 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d84a9c-6e84-47ab-8e9c-8237c11e1420-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:23:29 crc kubenswrapper[4911]: I1201 00:23:29.216882 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwkb2" event={"ID":"57d84a9c-6e84-47ab-8e9c-8237c11e1420","Type":"ContainerDied","Data":"273329e5f065b9e917bfebde7ba45ea3e32beb07aa647c8d3ddcaeea88486ef1"} Dec 01 00:23:29 crc kubenswrapper[4911]: I1201 00:23:29.216942 4911 scope.go:117] "RemoveContainer" containerID="b37c1e3b64f8f3d7d46518c878196bded39a57b9c1482939ac4acb52f63bfc3a" Dec 01 00:23:29 crc kubenswrapper[4911]: I1201 00:23:29.217078 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwkb2" Dec 01 00:23:29 crc kubenswrapper[4911]: I1201 00:23:29.245364 4911 scope.go:117] "RemoveContainer" containerID="0c4aaa54b69527d2bb8949df5697e7f85ce250f32118598c5ec5a654e527de5c" Dec 01 00:23:29 crc kubenswrapper[4911]: I1201 00:23:29.258441 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwkb2"] Dec 01 00:23:29 crc kubenswrapper[4911]: I1201 00:23:29.263430 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jwkb2"] Dec 01 00:23:29 crc kubenswrapper[4911]: I1201 00:23:29.281499 4911 scope.go:117] "RemoveContainer" containerID="a21595dba10be44c65f5e20199b1da0324eeefd307a960c140c736ce16f11312" Dec 01 00:23:29 crc kubenswrapper[4911]: I1201 00:23:29.657787 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:23:30 crc kubenswrapper[4911]: I1201 00:23:30.169996 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" path="/var/lib/kubelet/pods/57d84a9c-6e84-47ab-8e9c-8237c11e1420/volumes" Dec 01 00:23:30 crc kubenswrapper[4911]: I1201 00:23:30.603496 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5sw59"] Dec 01 00:23:30 crc kubenswrapper[4911]: I1201 00:23:30.605356 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5sw59" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerName="registry-server" containerID="cri-o://c0485abe06dce2320e531d7aa1e3f48f7ce61d69c59bd87373f44f33805df6e8" gracePeriod=2 Dec 01 00:23:30 crc kubenswrapper[4911]: I1201 00:23:30.871272 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" 
containerName="elasticsearch" probeResult="failure" output=< Dec 01 00:23:30 crc kubenswrapper[4911]: {"timestamp": "2025-12-01T00:23:30+00:00", "message": "readiness probe failed", "curl_rc": "7"} Dec 01 00:23:30 crc kubenswrapper[4911]: > Dec 01 00:23:33 crc kubenswrapper[4911]: E1201 00:23:33.821776 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97059c92_c1b1_41eb_8fcb_a28effa936f7.slice/crio-conmon-c0485abe06dce2320e531d7aa1e3f48f7ce61d69c59bd87373f44f33805df6e8.scope\": RecentStats: unable to find data in memory cache]" Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.262860 4911 generic.go:334] "Generic (PLEG): container finished" podID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerID="c0485abe06dce2320e531d7aa1e3f48f7ce61d69c59bd87373f44f33805df6e8" exitCode=0 Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.262924 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sw59" event={"ID":"97059c92-c1b1-41eb-8fcb-a28effa936f7","Type":"ContainerDied","Data":"c0485abe06dce2320e531d7aa1e3f48f7ce61d69c59bd87373f44f33805df6e8"} Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.742276 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.792517 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbgtp\" (UniqueName: \"kubernetes.io/projected/97059c92-c1b1-41eb-8fcb-a28effa936f7-kube-api-access-lbgtp\") pod \"97059c92-c1b1-41eb-8fcb-a28effa936f7\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.792566 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-catalog-content\") pod \"97059c92-c1b1-41eb-8fcb-a28effa936f7\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.792631 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-utilities\") pod \"97059c92-c1b1-41eb-8fcb-a28effa936f7\" (UID: \"97059c92-c1b1-41eb-8fcb-a28effa936f7\") " Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.793739 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-utilities" (OuterVolumeSpecName: "utilities") pod "97059c92-c1b1-41eb-8fcb-a28effa936f7" (UID: "97059c92-c1b1-41eb-8fcb-a28effa936f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.800912 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97059c92-c1b1-41eb-8fcb-a28effa936f7-kube-api-access-lbgtp" (OuterVolumeSpecName: "kube-api-access-lbgtp") pod "97059c92-c1b1-41eb-8fcb-a28effa936f7" (UID: "97059c92-c1b1-41eb-8fcb-a28effa936f7"). InnerVolumeSpecName "kube-api-access-lbgtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.850274 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97059c92-c1b1-41eb-8fcb-a28effa936f7" (UID: "97059c92-c1b1-41eb-8fcb-a28effa936f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.894244 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbgtp\" (UniqueName: \"kubernetes.io/projected/97059c92-c1b1-41eb-8fcb-a28effa936f7-kube-api-access-lbgtp\") on node \"crc\" DevicePath \"\"" Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.894295 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:23:34 crc kubenswrapper[4911]: I1201 00:23:34.894313 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97059c92-c1b1-41eb-8fcb-a28effa936f7-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:23:35 crc kubenswrapper[4911]: I1201 00:23:35.270969 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sw59" event={"ID":"97059c92-c1b1-41eb-8fcb-a28effa936f7","Type":"ContainerDied","Data":"7c25543709de68b73f804ff5add76008a98344b5ea449de94b5a7587dd6b1643"} Dec 01 00:23:35 crc kubenswrapper[4911]: I1201 00:23:35.271030 4911 scope.go:117] "RemoveContainer" containerID="c0485abe06dce2320e531d7aa1e3f48f7ce61d69c59bd87373f44f33805df6e8" Dec 01 00:23:35 crc kubenswrapper[4911]: I1201 00:23:35.271162 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5sw59" Dec 01 00:23:35 crc kubenswrapper[4911]: I1201 00:23:35.291755 4911 scope.go:117] "RemoveContainer" containerID="71f5dc1960e195966d4ddf2da648b0be24aff3586f0023b76cb60605d5ef8767" Dec 01 00:23:35 crc kubenswrapper[4911]: I1201 00:23:35.305154 4911 scope.go:117] "RemoveContainer" containerID="4c206d27569a4da7e6efad335746e76bba2648a9073a4dbd147664cf4156efe5" Dec 01 00:23:35 crc kubenswrapper[4911]: I1201 00:23:35.310573 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5sw59"] Dec 01 00:23:35 crc kubenswrapper[4911]: I1201 00:23:35.324393 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5sw59"] Dec 01 00:23:35 crc kubenswrapper[4911]: I1201 00:23:35.811701 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" containerName="elasticsearch" probeResult="failure" output=< Dec 01 00:23:35 crc kubenswrapper[4911]: {"timestamp": "2025-12-01T00:23:35+00:00", "message": "readiness probe failed", "curl_rc": "7"} Dec 01 00:23:35 crc kubenswrapper[4911]: > Dec 01 00:23:36 crc kubenswrapper[4911]: I1201 00:23:36.159505 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" path="/var/lib/kubelet/pods/97059c92-c1b1-41eb-8fcb-a28effa936f7/volumes" Dec 01 00:23:40 crc kubenswrapper[4911]: I1201 00:23:40.797478 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" containerName="elasticsearch" probeResult="failure" output=< Dec 01 00:23:40 crc kubenswrapper[4911]: {"timestamp": "2025-12-01T00:23:40+00:00", "message": "readiness probe failed", "curl_rc": "7"} Dec 01 00:23:40 crc kubenswrapper[4911]: > Dec 01 00:23:45 crc kubenswrapper[4911]: I1201 
00:23:45.799147 4911 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="97613b36-7079-4bac-afc8-0c933bcb2d4d" containerName="elasticsearch" probeResult="failure" output=< Dec 01 00:23:45 crc kubenswrapper[4911]: {"timestamp": "2025-12-01T00:23:45+00:00", "message": "readiness probe failed", "curl_rc": "7"} Dec 01 00:23:45 crc kubenswrapper[4911]: > Dec 01 00:23:50 crc kubenswrapper[4911]: I1201 00:23:50.975981 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Dec 01 00:23:51 crc kubenswrapper[4911]: I1201 00:23:51.311427 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:23:51 crc kubenswrapper[4911]: I1201 00:23:51.311502 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:24:21 crc kubenswrapper[4911]: I1201 00:24:21.311334 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:24:21 crc kubenswrapper[4911]: I1201 00:24:21.312194 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:24:21 crc kubenswrapper[4911]: I1201 00:24:21.312261 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:24:21 crc kubenswrapper[4911]: I1201 00:24:21.313127 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b471926ec3d05582c4ed624725570182663e3031685169997a11e92aa05c8b3"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:24:21 crc kubenswrapper[4911]: I1201 00:24:21.313189 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://5b471926ec3d05582c4ed624725570182663e3031685169997a11e92aa05c8b3" gracePeriod=600 Dec 01 00:24:21 crc kubenswrapper[4911]: I1201 00:24:21.841336 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="5b471926ec3d05582c4ed624725570182663e3031685169997a11e92aa05c8b3" exitCode=0 Dec 01 00:24:21 crc kubenswrapper[4911]: I1201 00:24:21.841425 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"5b471926ec3d05582c4ed624725570182663e3031685169997a11e92aa05c8b3"} Dec 01 00:24:21 crc kubenswrapper[4911]: I1201 00:24:21.842066 4911 scope.go:117] "RemoveContainer" containerID="90b36241e5b9b053d99526d93d8d01cf61ef69de06fe015790f530836c79c9f7" Dec 01 00:24:22 crc kubenswrapper[4911]: I1201 00:24:22.852223 4911 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"3617b2a113e3ef73df7ff22f074cb98d7742d918e195bf015afe52cfb194d2c6"} Dec 01 00:25:17 crc kubenswrapper[4911]: I1201 00:25:17.263839 4911 generic.go:334] "Generic (PLEG): container finished" podID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerID="54512a6cc8b4d5bfc4bce1df3c1dfd9bc56c34d1c08b8e9275875f9dcce22da9" exitCode=0 Dec 01 00:25:17 crc kubenswrapper[4911]: I1201 00:25:17.264124 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f6a17336-11f8-44d3-a4f5-bffc3ce3becf","Type":"ContainerDied","Data":"54512a6cc8b4d5bfc4bce1df3c1dfd9bc56c34d1c08b8e9275875f9dcce22da9"} Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.535150 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.707994 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-blob-cache\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.708055 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-proxy-ca-bundles\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.708089 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildworkdir\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.708307 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-ca-bundles\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.708341 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-system-configs\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709129 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vg7l\" (UniqueName: \"kubernetes.io/projected/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-kube-api-access-2vg7l\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709167 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-run\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709205 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-node-pullsecrets\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 
00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709220 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildcachedir\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709218 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709242 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-root\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709321 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-pull\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709379 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-push\") pod \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\" (UID: \"f6a17336-11f8-44d3-a4f5-bffc3ce3becf\") " Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709717 
4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709758 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709772 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.709898 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.710536 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.711045 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.723612 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.723666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.723680 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-kube-api-access-2vg7l" (OuterVolumeSpecName: "kube-api-access-2vg7l") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "kube-api-access-2vg7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.764186 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813163 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813202 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813212 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813222 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813231 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vg7l\" (UniqueName: \"kubernetes.io/projected/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-kube-api-access-2vg7l\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813239 4911 reconciler_common.go:293] "Volume detached 
for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813247 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813255 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.813264 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.907560 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:25:18 crc kubenswrapper[4911]: I1201 00:25:18.914898 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:19 crc kubenswrapper[4911]: I1201 00:25:19.281208 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"f6a17336-11f8-44d3-a4f5-bffc3ce3becf","Type":"ContainerDied","Data":"94e7fd7c26441071f0ea66a61f0d6a953eadacc374e0ee2ab6215d901141d5a3"} Dec 01 00:25:19 crc kubenswrapper[4911]: I1201 00:25:19.281247 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e7fd7c26441071f0ea66a61f0d6a953eadacc374e0ee2ab6215d901141d5a3" Dec 01 00:25:19 crc kubenswrapper[4911]: I1201 00:25:19.281298 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Dec 01 00:25:21 crc kubenswrapper[4911]: I1201 00:25:21.031736 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f6a17336-11f8-44d3-a4f5-bffc3ce3becf" (UID: "f6a17336-11f8-44d3-a4f5-bffc3ce3becf"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:25:21 crc kubenswrapper[4911]: I1201 00:25:21.044125 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f6a17336-11f8-44d3-a4f5-bffc3ce3becf-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.906302 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907048 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerName="extract-content" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907066 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerName="extract-content" Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907083 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerName="registry-server" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907090 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerName="registry-server" Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907106 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerName="manage-dockerfile" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907114 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerName="manage-dockerfile" Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907125 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerName="extract-content" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907132 4911 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerName="extract-content" Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907144 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerName="extract-utilities" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907151 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerName="extract-utilities" Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907161 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerName="extract-utilities" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907169 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerName="extract-utilities" Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907179 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerName="registry-server" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907188 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerName="registry-server" Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907200 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerName="git-clone" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907207 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerName="git-clone" Dec 01 00:25:23 crc kubenswrapper[4911]: E1201 00:25:23.907217 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerName="docker-build" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907224 4911 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerName="docker-build" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907374 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="97059c92-c1b1-41eb-8fcb-a28effa936f7" containerName="registry-server" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907391 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d84a9c-6e84-47ab-8e9c-8237c11e1420" containerName="registry-server" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.907402 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a17336-11f8-44d3-a4f5-bffc3ce3becf" containerName="docker-build" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.908208 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.910527 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.910748 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.910542 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.910594 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Dec 01 00:25:23 crc kubenswrapper[4911]: I1201 00:25:23.935184 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081010 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081126 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081173 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081228 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081258 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rct65\" (UniqueName: \"kubernetes.io/projected/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-kube-api-access-rct65\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081297 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081326 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081367 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081401 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081524 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-proxy-ca-bundles\") pod 
\"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081721 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.081772 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.182711 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.182820 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.182877 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.182905 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.182946 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.182989 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rct65\" (UniqueName: \"kubernetes.io/projected/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-kube-api-access-rct65\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.183041 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.183082 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.183143 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.183192 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.183507 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.183197 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.183718 
4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.183963 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.184799 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.184917 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.184973 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 
00:25:24.185062 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.185075 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.185549 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.185767 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.190963 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.191205 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.202586 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rct65\" (UniqueName: \"kubernetes.io/projected/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-kube-api-access-rct65\") pod \"smart-gateway-operator-1-build\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.224262 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:24 crc kubenswrapper[4911]: I1201 00:25:24.462451 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Dec 01 00:25:25 crc kubenswrapper[4911]: I1201 00:25:25.340784 4911 generic.go:334] "Generic (PLEG): container finished" podID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" containerID="af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1" exitCode=0 Dec 01 00:25:25 crc kubenswrapper[4911]: I1201 00:25:25.341004 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9","Type":"ContainerDied","Data":"af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1"} Dec 01 00:25:25 crc kubenswrapper[4911]: I1201 00:25:25.341180 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" 
event={"ID":"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9","Type":"ContainerStarted","Data":"aa1ec1a33d116e31fc0e2a9e5edd66ac005df406a45637624cd9ea7cce94166a"} Dec 01 00:25:26 crc kubenswrapper[4911]: I1201 00:25:26.350097 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9","Type":"ContainerStarted","Data":"6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05"} Dec 01 00:25:26 crc kubenswrapper[4911]: I1201 00:25:26.386157 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.386135745 podStartE2EDuration="3.386135745s" podCreationTimestamp="2025-12-01 00:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:25:26.383279755 +0000 UTC m=+1086.521976586" watchObservedRunningTime="2025-12-01 00:25:26.386135745 +0000 UTC m=+1086.524832526" Dec 01 00:25:34 crc kubenswrapper[4911]: I1201 00:25:34.923113 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Dec 01 00:25:34 crc kubenswrapper[4911]: I1201 00:25:34.924690 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" containerName="docker-build" containerID="cri-o://6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05" gracePeriod=30 Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.487169 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.489498 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.491288 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.491814 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.492211 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.517002 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573216 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573297 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573320 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-push\") 
pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573340 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573392 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573451 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573530 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573601 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573662 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573714 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573737 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlfb\" (UniqueName: \"kubernetes.io/projected/e1033824-d6b5-4d20-b20d-69b95563aaf1-kube-api-access-wqlfb\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.573801 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 
00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675320 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675405 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675438 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675485 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675517 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: 
\"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675548 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675573 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675598 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675624 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675642 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-blob-cache\") pod 
\"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675659 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlfb\" (UniqueName: \"kubernetes.io/projected/e1033824-d6b5-4d20-b20d-69b95563aaf1-kube-api-access-wqlfb\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675712 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675791 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.676014 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.675560 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.676692 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.676783 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.677050 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.677253 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.677366 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.677402 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.682261 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.684673 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.697225 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlfb\" (UniqueName: \"kubernetes.io/projected/e1033824-d6b5-4d20-b20d-69b95563aaf1-kube-api-access-wqlfb\") pod \"smart-gateway-operator-2-build\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:36 crc kubenswrapper[4911]: I1201 00:25:36.805057 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.056560 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.130965 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_e40a63e6-21ef-4fe1-a0e4-d258457ae1f9/docker-build/0.log" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.131559 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.283968 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-run\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284011 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-push\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284047 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildworkdir\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284078 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-root\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284102 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-system-configs\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284126 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildcachedir\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284143 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-node-pullsecrets\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284197 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-pull\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284226 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rct65\" (UniqueName: \"kubernetes.io/projected/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-kube-api-access-rct65\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: 
\"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284258 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-proxy-ca-bundles\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284279 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-ca-bundles\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.284298 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-blob-cache\") pod \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\" (UID: \"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9\") " Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.285105 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.285143 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.285153 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.285571 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.285565 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.285801 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.285934 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.290027 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-kube-api-access-rct65" (OuterVolumeSpecName: "kube-api-access-rct65") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "kube-api-access-rct65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.290302 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.292480 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.385911 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386068 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386150 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386379 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386486 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386567 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386654 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath 
\"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386754 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rct65\" (UniqueName: \"kubernetes.io/projected/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-kube-api-access-rct65\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386859 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.386969 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.454828 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_e40a63e6-21ef-4fe1-a0e4-d258457ae1f9/docker-build/0.log" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.455747 4911 generic.go:334] "Generic (PLEG): container finished" podID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" containerID="6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05" exitCode=1 Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.455819 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.455852 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9","Type":"ContainerDied","Data":"6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05"} Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.457309 4911 scope.go:117] "RemoveContainer" containerID="6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.457248 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e40a63e6-21ef-4fe1-a0e4-d258457ae1f9","Type":"ContainerDied","Data":"aa1ec1a33d116e31fc0e2a9e5edd66ac005df406a45637624cd9ea7cce94166a"} Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.460345 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"e1033824-d6b5-4d20-b20d-69b95563aaf1","Type":"ContainerStarted","Data":"9e23e03a4ff50e31fe1ec4503d756dd7e1d82f80f1f3e28067e653b6d54646c1"} Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.705126 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:25:37 crc kubenswrapper[4911]: I1201 00:25:37.792075 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:38 crc kubenswrapper[4911]: I1201 00:25:38.467200 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"e1033824-d6b5-4d20-b20d-69b95563aaf1","Type":"ContainerStarted","Data":"3f2c63cde2ee8c5cb7395993eb25cb5d46b68908ad5ee56af0615caf9605b15c"} Dec 01 00:25:38 crc kubenswrapper[4911]: I1201 00:25:38.986265 4911 scope.go:117] "RemoveContainer" containerID="af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1" Dec 01 00:25:39 crc kubenswrapper[4911]: I1201 00:25:39.018497 4911 scope.go:117] "RemoveContainer" containerID="6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05" Dec 01 00:25:39 crc kubenswrapper[4911]: E1201 00:25:39.020314 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05\": container with ID starting with 6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05 not found: ID does not exist" containerID="6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05" Dec 01 00:25:39 crc kubenswrapper[4911]: I1201 00:25:39.020374 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05"} err="failed to get container status \"6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05\": rpc error: code = NotFound desc = could not find container \"6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05\": container with ID starting with 
6103ca9ebc55ed3bb963a1882c7077e0ea78ab2517083b9c831db2cf5193ca05 not found: ID does not exist" Dec 01 00:25:39 crc kubenswrapper[4911]: I1201 00:25:39.020407 4911 scope.go:117] "RemoveContainer" containerID="af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1" Dec 01 00:25:39 crc kubenswrapper[4911]: E1201 00:25:39.023411 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1\": container with ID starting with af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1 not found: ID does not exist" containerID="af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1" Dec 01 00:25:39 crc kubenswrapper[4911]: I1201 00:25:39.023489 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1"} err="failed to get container status \"af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1\": rpc error: code = NotFound desc = could not find container \"af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1\": container with ID starting with af0637110dc32884f99f6af1c77ea2dda8d56a1fd163e3f32c22e7c1b65117c1 not found: ID does not exist" Dec 01 00:25:41 crc kubenswrapper[4911]: I1201 00:25:41.488037 4911 generic.go:334] "Generic (PLEG): container finished" podID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerID="3f2c63cde2ee8c5cb7395993eb25cb5d46b68908ad5ee56af0615caf9605b15c" exitCode=0 Dec 01 00:25:41 crc kubenswrapper[4911]: I1201 00:25:41.488093 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"e1033824-d6b5-4d20-b20d-69b95563aaf1","Type":"ContainerDied","Data":"3f2c63cde2ee8c5cb7395993eb25cb5d46b68908ad5ee56af0615caf9605b15c"} Dec 01 00:25:41 crc kubenswrapper[4911]: I1201 00:25:41.944525 4911 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" (UID: "e40a63e6-21ef-4fe1-a0e4-d258457ae1f9"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:25:41 crc kubenswrapper[4911]: I1201 00:25:41.960160 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:25:41 crc kubenswrapper[4911]: I1201 00:25:41.989149 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Dec 01 00:25:41 crc kubenswrapper[4911]: I1201 00:25:41.996091 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Dec 01 00:25:42 crc kubenswrapper[4911]: I1201 00:25:42.167218 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" path="/var/lib/kubelet/pods/e40a63e6-21ef-4fe1-a0e4-d258457ae1f9/volumes" Dec 01 00:25:42 crc kubenswrapper[4911]: I1201 00:25:42.497168 4911 generic.go:334] "Generic (PLEG): container finished" podID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerID="fea0308793f50e2cf901adde41c4e8eacf6f1774bb45574c802d718be51e00cf" exitCode=0 Dec 01 00:25:42 crc kubenswrapper[4911]: I1201 00:25:42.497208 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"e1033824-d6b5-4d20-b20d-69b95563aaf1","Type":"ContainerDied","Data":"fea0308793f50e2cf901adde41c4e8eacf6f1774bb45574c802d718be51e00cf"} Dec 01 00:25:42 crc kubenswrapper[4911]: I1201 00:25:42.538842 4911 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_e1033824-d6b5-4d20-b20d-69b95563aaf1/manage-dockerfile/0.log" Dec 01 00:25:43 crc kubenswrapper[4911]: I1201 00:25:43.510021 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"e1033824-d6b5-4d20-b20d-69b95563aaf1","Type":"ContainerStarted","Data":"c7e22f162f94279c0f7fb02ec297fe1f48ec3b2b672028bb055dedc432430de7"} Dec 01 00:25:43 crc kubenswrapper[4911]: I1201 00:25:43.554307 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=7.554266818 podStartE2EDuration="7.554266818s" podCreationTimestamp="2025-12-01 00:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:25:43.538569779 +0000 UTC m=+1103.677266560" watchObservedRunningTime="2025-12-01 00:25:43.554266818 +0000 UTC m=+1103.692963649" Dec 01 00:26:51 crc kubenswrapper[4911]: I1201 00:26:51.312068 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:26:51 crc kubenswrapper[4911]: I1201 00:26:51.312769 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:27:18 crc kubenswrapper[4911]: I1201 00:27:18.258922 4911 generic.go:334] "Generic (PLEG): container finished" podID="e1033824-d6b5-4d20-b20d-69b95563aaf1" 
containerID="c7e22f162f94279c0f7fb02ec297fe1f48ec3b2b672028bb055dedc432430de7" exitCode=0 Dec 01 00:27:18 crc kubenswrapper[4911]: I1201 00:27:18.258976 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"e1033824-d6b5-4d20-b20d-69b95563aaf1","Type":"ContainerDied","Data":"c7e22f162f94279c0f7fb02ec297fe1f48ec3b2b672028bb055dedc432430de7"} Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.587957 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703252 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildcachedir\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703323 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-run\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703388 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-proxy-ca-bundles\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703441 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqlfb\" (UniqueName: \"kubernetes.io/projected/e1033824-d6b5-4d20-b20d-69b95563aaf1-kube-api-access-wqlfb\") pod 
\"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703511 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-system-configs\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703526 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-ca-bundles\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703573 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-root\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703612 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-pull\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703637 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildworkdir\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703669 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-blob-cache\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703692 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-push\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.703713 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-node-pullsecrets\") pod \"e1033824-d6b5-4d20-b20d-69b95563aaf1\" (UID: \"e1033824-d6b5-4d20-b20d-69b95563aaf1\") " Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.704019 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.704171 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.704708 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.704755 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.704828 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.706256 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.712025 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1033824-d6b5-4d20-b20d-69b95563aaf1-kube-api-access-wqlfb" (OuterVolumeSpecName: "kube-api-access-wqlfb") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "kube-api-access-wqlfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.712570 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.715785 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.716504 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.805790 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.805840 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.805860 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e1033824-d6b5-4d20-b20d-69b95563aaf1-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.805878 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.805895 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e1033824-d6b5-4d20-b20d-69b95563aaf1-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.805911 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.805928 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.805945 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqlfb\" (UniqueName: \"kubernetes.io/projected/e1033824-d6b5-4d20-b20d-69b95563aaf1-kube-api-access-wqlfb\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.806033 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.806053 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.903906 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:27:19 crc kubenswrapper[4911]: I1201 00:27:19.907736 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:20 crc kubenswrapper[4911]: I1201 00:27:20.276167 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"e1033824-d6b5-4d20-b20d-69b95563aaf1","Type":"ContainerDied","Data":"9e23e03a4ff50e31fe1ec4503d756dd7e1d82f80f1f3e28067e653b6d54646c1"} Dec 01 00:27:20 crc kubenswrapper[4911]: I1201 00:27:20.276206 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e23e03a4ff50e31fe1ec4503d756dd7e1d82f80f1f3e28067e653b6d54646c1" Dec 01 00:27:20 crc kubenswrapper[4911]: I1201 00:27:20.276304 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Dec 01 00:27:21 crc kubenswrapper[4911]: I1201 00:27:21.312756 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:27:21 crc kubenswrapper[4911]: I1201 00:27:21.313088 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:27:22 crc kubenswrapper[4911]: I1201 00:27:22.229022 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e1033824-d6b5-4d20-b20d-69b95563aaf1" (UID: "e1033824-d6b5-4d20-b20d-69b95563aaf1"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:27:22 crc kubenswrapper[4911]: I1201 00:27:22.238346 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e1033824-d6b5-4d20-b20d-69b95563aaf1-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.315917 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Dec 01 00:27:24 crc kubenswrapper[4911]: E1201 00:27:24.316183 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerName="docker-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.316198 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerName="docker-build" Dec 01 00:27:24 crc kubenswrapper[4911]: E1201 00:27:24.316211 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerName="git-clone" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.316220 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerName="git-clone" Dec 01 00:27:24 crc kubenswrapper[4911]: E1201 00:27:24.316237 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" containerName="docker-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.316245 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" containerName="docker-build" Dec 01 00:27:24 crc kubenswrapper[4911]: E1201 00:27:24.316256 
4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" containerName="manage-dockerfile" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.316263 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" containerName="manage-dockerfile" Dec 01 00:27:24 crc kubenswrapper[4911]: E1201 00:27:24.316274 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerName="manage-dockerfile" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.316281 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerName="manage-dockerfile" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.316398 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1033824-d6b5-4d20-b20d-69b95563aaf1" containerName="docker-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.316411 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e40a63e6-21ef-4fe1-a0e4-d258457ae1f9" containerName="docker-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.317136 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.322705 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.322991 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.323179 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.323385 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.329699 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.467432 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.468171 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.468305 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-root\") pod 
\"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.468435 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.468595 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.468708 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.468822 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.468933 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-node-pullsecrets\") pod 
\"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.469052 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-run\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.469185 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-pull\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.469298 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zmvd\" (UniqueName: \"kubernetes.io/projected/a474e341-3dc2-491e-829d-2e47255e673a-kube-api-access-7zmvd\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.469402 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-push\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571194 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571239 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571258 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571274 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571298 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-run\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571350 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: 
\"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-pull\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571369 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-push\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571385 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zmvd\" (UniqueName: \"kubernetes.io/projected/a474e341-3dc2-491e-829d-2e47255e673a-kube-api-access-7zmvd\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571435 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571463 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571504 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-root\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571533 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.571599 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.572176 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.572380 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-root\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.572429 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-run\") pod \"sg-core-1-build\" (UID: 
\"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.572463 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.573048 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.572234 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.573383 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.574549 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.578684 4911 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-push\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.578803 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-pull\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.592596 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zmvd\" (UniqueName: \"kubernetes.io/projected/a474e341-3dc2-491e-829d-2e47255e673a-kube-api-access-7zmvd\") pod \"sg-core-1-build\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.636981 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Dec 01 00:27:24 crc kubenswrapper[4911]: I1201 00:27:24.834675 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Dec 01 00:27:25 crc kubenswrapper[4911]: I1201 00:27:25.315738 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"a474e341-3dc2-491e-829d-2e47255e673a","Type":"ContainerStarted","Data":"0dda127ba5963a153cd58d24ea209c630af8e310ce0f67f3afcd51b20b8db6eb"} Dec 01 00:27:26 crc kubenswrapper[4911]: I1201 00:27:26.326772 4911 generic.go:334] "Generic (PLEG): container finished" podID="a474e341-3dc2-491e-829d-2e47255e673a" containerID="c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6" exitCode=0 Dec 01 00:27:26 crc kubenswrapper[4911]: I1201 00:27:26.326841 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"a474e341-3dc2-491e-829d-2e47255e673a","Type":"ContainerDied","Data":"c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6"} Dec 01 00:27:27 crc kubenswrapper[4911]: I1201 00:27:27.336380 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"a474e341-3dc2-491e-829d-2e47255e673a","Type":"ContainerStarted","Data":"4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b"} Dec 01 00:27:27 crc kubenswrapper[4911]: I1201 00:27:27.362970 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=3.36294819 podStartE2EDuration="3.36294819s" podCreationTimestamp="2025-12-01 00:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:27:27.360417929 +0000 UTC m=+1207.499114720" watchObservedRunningTime="2025-12-01 00:27:27.36294819 +0000 UTC m=+1207.501644961" Dec 01 00:27:34 crc 
kubenswrapper[4911]: I1201 00:27:34.735434 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Dec 01 00:27:34 crc kubenswrapper[4911]: I1201 00:27:34.736298 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="a474e341-3dc2-491e-829d-2e47255e673a" containerName="docker-build" containerID="cri-o://4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b" gracePeriod=30 Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.168632 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_a474e341-3dc2-491e-829d-2e47255e673a/docker-build/0.log" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.169415 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.312873 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-system-configs\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313075 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-node-pullsecrets\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313130 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-root\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 
00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313149 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313359 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zmvd\" (UniqueName: \"kubernetes.io/projected/a474e341-3dc2-491e-829d-2e47255e673a-kube-api-access-7zmvd\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313454 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-push\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313529 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-buildworkdir\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313565 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-buildcachedir\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313616 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-proxy-ca-bundles\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313642 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-run\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313662 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-pull\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313695 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-build-blob-cache\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313718 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-ca-bundles\") pod \"a474e341-3dc2-491e-829d-2e47255e673a\" (UID: \"a474e341-3dc2-491e-829d-2e47255e673a\") " Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.313943 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.314182 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.314299 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.314485 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.314617 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.314805 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.315425 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.315952 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.319874 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.321274 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a474e341-3dc2-491e-829d-2e47255e673a-kube-api-access-7zmvd" (OuterVolumeSpecName: "kube-api-access-7zmvd") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "kube-api-access-7zmvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.326751 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.405975 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_a474e341-3dc2-491e-829d-2e47255e673a/docker-build/0.log" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.406397 4911 generic.go:334] "Generic (PLEG): container finished" podID="a474e341-3dc2-491e-829d-2e47255e673a" containerID="4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b" exitCode=1 Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.406444 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"a474e341-3dc2-491e-829d-2e47255e673a","Type":"ContainerDied","Data":"4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b"} Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.406499 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"a474e341-3dc2-491e-829d-2e47255e673a","Type":"ContainerDied","Data":"0dda127ba5963a153cd58d24ea209c630af8e310ce0f67f3afcd51b20b8db6eb"} Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.406521 4911 scope.go:117] "RemoveContainer" containerID="4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.406556 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.410079 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.416541 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zmvd\" (UniqueName: \"kubernetes.io/projected/a474e341-3dc2-491e-829d-2e47255e673a-kube-api-access-7zmvd\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.416570 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.416584 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a474e341-3dc2-491e-829d-2e47255e673a-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.416597 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.416608 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.416620 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/a474e341-3dc2-491e-829d-2e47255e673a-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.416631 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-build-blob-cache\") on 
node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.416642 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a474e341-3dc2-491e-829d-2e47255e673a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.459919 4911 scope.go:117] "RemoveContainer" containerID="c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.486337 4911 scope.go:117] "RemoveContainer" containerID="4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b" Dec 01 00:27:35 crc kubenswrapper[4911]: E1201 00:27:35.487104 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b\": container with ID starting with 4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b not found: ID does not exist" containerID="4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.487370 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b"} err="failed to get container status \"4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b\": rpc error: code = NotFound desc = could not find container \"4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b\": container with ID starting with 4b9ea59c97514ae7b9b85b48b82ec480ab5650d5e898ee3ca2b970b583bae43b not found: ID does not exist" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.487411 4911 scope.go:117] "RemoveContainer" containerID="c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6" Dec 01 00:27:35 crc kubenswrapper[4911]: E1201 00:27:35.488419 4911 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6\": container with ID starting with c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6 not found: ID does not exist" containerID="c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.488488 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6"} err="failed to get container status \"c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6\": rpc error: code = NotFound desc = could not find container \"c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6\": container with ID starting with c8b3f0bdbaed14605559c5aac20f54f9a5b376a181072ba9e89c11834a98a0c6 not found: ID does not exist" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.531962 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a474e341-3dc2-491e-829d-2e47255e673a" (UID: "a474e341-3dc2-491e-829d-2e47255e673a"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.619494 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a474e341-3dc2-491e-829d-2e47255e673a-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.775146 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Dec 01 00:27:35 crc kubenswrapper[4911]: I1201 00:27:35.785439 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.170752 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a474e341-3dc2-491e-829d-2e47255e673a" path="/var/lib/kubelet/pods/a474e341-3dc2-491e-829d-2e47255e673a/volumes" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.483499 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Dec 01 00:27:36 crc kubenswrapper[4911]: E1201 00:27:36.484144 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474e341-3dc2-491e-829d-2e47255e673a" containerName="docker-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.484158 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474e341-3dc2-491e-829d-2e47255e673a" containerName="docker-build" Dec 01 00:27:36 crc kubenswrapper[4911]: E1201 00:27:36.484173 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474e341-3dc2-491e-829d-2e47255e673a" containerName="manage-dockerfile" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.484180 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474e341-3dc2-491e-829d-2e47255e673a" containerName="manage-dockerfile" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.484278 4911 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a474e341-3dc2-491e-829d-2e47255e673a" containerName="docker-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.485082 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.487390 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.487695 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.488268 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.488641 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.506885 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.633426 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildcachedir\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.633746 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-push\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.633853 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-system-configs\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.633942 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-pull\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.634013 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.634117 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.634230 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wzl4\" (UniqueName: \"kubernetes.io/projected/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-kube-api-access-8wzl4\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 
00:27:36.634312 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-root\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.634431 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildworkdir\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.634572 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.634686 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.634821 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-run\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736284 
4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildcachedir\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736349 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-push\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736384 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-system-configs\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736429 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-pull\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736487 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736553 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736637 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wzl4\" (UniqueName: \"kubernetes.io/projected/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-kube-api-access-8wzl4\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736668 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-root\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736701 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildworkdir\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736755 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736801 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736802 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.736835 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-run\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.737140 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.737187 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-system-configs\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.737234 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildcachedir\") pod \"sg-core-2-build\" (UID: 
\"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.737251 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-root\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.737526 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.737705 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildworkdir\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.737901 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-run\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.738873 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 
00:27:36.740075 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-push\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.740084 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-pull\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.762270 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wzl4\" (UniqueName: \"kubernetes.io/projected/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-kube-api-access-8wzl4\") pod \"sg-core-2-build\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") " pod="service-telemetry/sg-core-2-build" Dec 01 00:27:36 crc kubenswrapper[4911]: I1201 00:27:36.810582 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Dec 01 00:27:37 crc kubenswrapper[4911]: I1201 00:27:37.016654 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Dec 01 00:27:37 crc kubenswrapper[4911]: I1201 00:27:37.425657 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f","Type":"ContainerStarted","Data":"b4f42e77a2be4aa2257f45e9b84883f6d90827cd44ee2507a112b76a6e2e651e"} Dec 01 00:27:37 crc kubenswrapper[4911]: I1201 00:27:37.425738 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f","Type":"ContainerStarted","Data":"0289f5f94f1e5f0703962a7c2abcbd7c8d114725c0d4160d143929c5d8f89609"} Dec 01 00:27:37 crc kubenswrapper[4911]: E1201 00:27:37.827702 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fbb3a5c_bba4_497d_a28e_e6e7d3d2244f.slice/crio-b4f42e77a2be4aa2257f45e9b84883f6d90827cd44ee2507a112b76a6e2e651e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fbb3a5c_bba4_497d_a28e_e6e7d3d2244f.slice/crio-conmon-b4f42e77a2be4aa2257f45e9b84883f6d90827cd44ee2507a112b76a6e2e651e.scope\": RecentStats: unable to find data in memory cache]" Dec 01 00:27:38 crc kubenswrapper[4911]: I1201 00:27:38.676411 4911 generic.go:334] "Generic (PLEG): container finished" podID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerID="b4f42e77a2be4aa2257f45e9b84883f6d90827cd44ee2507a112b76a6e2e651e" exitCode=0 Dec 01 00:27:38 crc kubenswrapper[4911]: I1201 00:27:38.676839 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" 
event={"ID":"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f","Type":"ContainerDied","Data":"b4f42e77a2be4aa2257f45e9b84883f6d90827cd44ee2507a112b76a6e2e651e"} Dec 01 00:27:39 crc kubenswrapper[4911]: I1201 00:27:39.967237 4911 generic.go:334] "Generic (PLEG): container finished" podID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerID="abd363216b4b67446668d5dec76860539f4e538b28059d813a43d437dc095c02" exitCode=0 Dec 01 00:27:39 crc kubenswrapper[4911]: I1201 00:27:39.967278 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f","Type":"ContainerDied","Data":"abd363216b4b67446668d5dec76860539f4e538b28059d813a43d437dc095c02"} Dec 01 00:27:39 crc kubenswrapper[4911]: I1201 00:27:39.997805 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f/manage-dockerfile/0.log" Dec 01 00:27:41 crc kubenswrapper[4911]: I1201 00:27:41.035666 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f","Type":"ContainerStarted","Data":"1f5c09adb8a60323da75470d716caf27c87cfc2eeb7a5880a68ad4cf5d42e20b"} Dec 01 00:27:41 crc kubenswrapper[4911]: I1201 00:27:41.070301 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.070276585 podStartE2EDuration="5.070276585s" podCreationTimestamp="2025-12-01 00:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:27:41.059356497 +0000 UTC m=+1221.198053278" watchObservedRunningTime="2025-12-01 00:27:41.070276585 +0000 UTC m=+1221.208973366" Dec 01 00:27:45 crc kubenswrapper[4911]: I1201 00:27:45.109827 4911 scope.go:117] "RemoveContainer" containerID="6248ece68b58f4766c8ebcb162098d955728ae8683705105aaa74ad145403186" Dec 01 
00:27:51 crc kubenswrapper[4911]: I1201 00:27:51.312099 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:27:51 crc kubenswrapper[4911]: I1201 00:27:51.312759 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:27:51 crc kubenswrapper[4911]: I1201 00:27:51.312801 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:27:51 crc kubenswrapper[4911]: I1201 00:27:51.313250 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3617b2a113e3ef73df7ff22f074cb98d7742d918e195bf015afe52cfb194d2c6"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:27:51 crc kubenswrapper[4911]: I1201 00:27:51.313296 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://3617b2a113e3ef73df7ff22f074cb98d7742d918e195bf015afe52cfb194d2c6" gracePeriod=600 Dec 01 00:27:52 crc kubenswrapper[4911]: I1201 00:27:52.707909 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" 
containerID="3617b2a113e3ef73df7ff22f074cb98d7742d918e195bf015afe52cfb194d2c6" exitCode=0 Dec 01 00:27:52 crc kubenswrapper[4911]: I1201 00:27:52.708533 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"3617b2a113e3ef73df7ff22f074cb98d7742d918e195bf015afe52cfb194d2c6"} Dec 01 00:27:52 crc kubenswrapper[4911]: I1201 00:27:52.710983 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"ee4df74b53d9f3dd3ba8fcd693c318ddcec3ad0d37918e7de6e25ea516415ccd"} Dec 01 00:27:52 crc kubenswrapper[4911]: I1201 00:27:52.711019 4911 scope.go:117] "RemoveContainer" containerID="5b471926ec3d05582c4ed624725570182663e3031685169997a11e92aa05c8b3" Dec 01 00:29:51 crc kubenswrapper[4911]: I1201 00:29:51.311454 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:29:51 crc kubenswrapper[4911]: I1201 00:29:51.312257 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.218717 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg"] Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.220199 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.222652 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.222743 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.229284 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg"] Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.296508 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a448f0-4680-4a7b-b82a-51f8f4b45613-secret-volume\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.296928 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqvsj\" (UniqueName: \"kubernetes.io/projected/77a448f0-4680-4a7b-b82a-51f8f4b45613-kube-api-access-gqvsj\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.297092 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a448f0-4680-4a7b-b82a-51f8f4b45613-config-volume\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.398535 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a448f0-4680-4a7b-b82a-51f8f4b45613-secret-volume\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.399088 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqvsj\" (UniqueName: \"kubernetes.io/projected/77a448f0-4680-4a7b-b82a-51f8f4b45613-kube-api-access-gqvsj\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.399115 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a448f0-4680-4a7b-b82a-51f8f4b45613-config-volume\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.400143 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a448f0-4680-4a7b-b82a-51f8f4b45613-config-volume\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.405384 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/77a448f0-4680-4a7b-b82a-51f8f4b45613-secret-volume\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.416270 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqvsj\" (UniqueName: \"kubernetes.io/projected/77a448f0-4680-4a7b-b82a-51f8f4b45613-kube-api-access-gqvsj\") pod \"collect-profiles-29409150-jxcpg\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.539108 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:00 crc kubenswrapper[4911]: I1201 00:30:00.777238 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg"] Dec 01 00:30:01 crc kubenswrapper[4911]: I1201 00:30:01.019775 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" event={"ID":"77a448f0-4680-4a7b-b82a-51f8f4b45613","Type":"ContainerStarted","Data":"5a1cbe4f664f2d3cb4a5693801ca03702d42cb83f38b4889193b24f12c458ece"} Dec 01 00:30:01 crc kubenswrapper[4911]: I1201 00:30:01.019825 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" event={"ID":"77a448f0-4680-4a7b-b82a-51f8f4b45613","Type":"ContainerStarted","Data":"840220bb93491f425c1b0a17bea5ac3fb565261ac6fb7d4667ef675ab2a3e501"} Dec 01 00:30:01 crc kubenswrapper[4911]: I1201 00:30:01.047392 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" 
podStartSLOduration=1.047374905 podStartE2EDuration="1.047374905s" podCreationTimestamp="2025-12-01 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:30:01.044083222 +0000 UTC m=+1361.182779993" watchObservedRunningTime="2025-12-01 00:30:01.047374905 +0000 UTC m=+1361.186071676" Dec 01 00:30:02 crc kubenswrapper[4911]: I1201 00:30:02.030975 4911 generic.go:334] "Generic (PLEG): container finished" podID="77a448f0-4680-4a7b-b82a-51f8f4b45613" containerID="5a1cbe4f664f2d3cb4a5693801ca03702d42cb83f38b4889193b24f12c458ece" exitCode=0 Dec 01 00:30:02 crc kubenswrapper[4911]: I1201 00:30:02.032438 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" event={"ID":"77a448f0-4680-4a7b-b82a-51f8f4b45613","Type":"ContainerDied","Data":"5a1cbe4f664f2d3cb4a5693801ca03702d42cb83f38b4889193b24f12c458ece"} Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.262837 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.432480 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqvsj\" (UniqueName: \"kubernetes.io/projected/77a448f0-4680-4a7b-b82a-51f8f4b45613-kube-api-access-gqvsj\") pod \"77a448f0-4680-4a7b-b82a-51f8f4b45613\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.432565 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a448f0-4680-4a7b-b82a-51f8f4b45613-config-volume\") pod \"77a448f0-4680-4a7b-b82a-51f8f4b45613\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.432611 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a448f0-4680-4a7b-b82a-51f8f4b45613-secret-volume\") pod \"77a448f0-4680-4a7b-b82a-51f8f4b45613\" (UID: \"77a448f0-4680-4a7b-b82a-51f8f4b45613\") " Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.433083 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a448f0-4680-4a7b-b82a-51f8f4b45613-config-volume" (OuterVolumeSpecName: "config-volume") pod "77a448f0-4680-4a7b-b82a-51f8f4b45613" (UID: "77a448f0-4680-4a7b-b82a-51f8f4b45613"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.438170 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a448f0-4680-4a7b-b82a-51f8f4b45613-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77a448f0-4680-4a7b-b82a-51f8f4b45613" (UID: "77a448f0-4680-4a7b-b82a-51f8f4b45613"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.438567 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a448f0-4680-4a7b-b82a-51f8f4b45613-kube-api-access-gqvsj" (OuterVolumeSpecName: "kube-api-access-gqvsj") pod "77a448f0-4680-4a7b-b82a-51f8f4b45613" (UID: "77a448f0-4680-4a7b-b82a-51f8f4b45613"). InnerVolumeSpecName "kube-api-access-gqvsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.534604 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a448f0-4680-4a7b-b82a-51f8f4b45613-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.534645 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqvsj\" (UniqueName: \"kubernetes.io/projected/77a448f0-4680-4a7b-b82a-51f8f4b45613-kube-api-access-gqvsj\") on node \"crc\" DevicePath \"\"" Dec 01 00:30:03 crc kubenswrapper[4911]: I1201 00:30:03.534654 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a448f0-4680-4a7b-b82a-51f8f4b45613-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:30:04 crc kubenswrapper[4911]: I1201 00:30:04.047325 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" event={"ID":"77a448f0-4680-4a7b-b82a-51f8f4b45613","Type":"ContainerDied","Data":"840220bb93491f425c1b0a17bea5ac3fb565261ac6fb7d4667ef675ab2a3e501"} Dec 01 00:30:04 crc kubenswrapper[4911]: I1201 00:30:04.047374 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="840220bb93491f425c1b0a17bea5ac3fb565261ac6fb7d4667ef675ab2a3e501" Dec 01 00:30:04 crc kubenswrapper[4911]: I1201 00:30:04.047835 4911 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409150-jxcpg" Dec 01 00:30:21 crc kubenswrapper[4911]: I1201 00:30:21.311525 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:30:21 crc kubenswrapper[4911]: I1201 00:30:21.312022 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:30:51 crc kubenswrapper[4911]: I1201 00:30:51.312103 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:30:51 crc kubenswrapper[4911]: I1201 00:30:51.313753 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:30:51 crc kubenswrapper[4911]: I1201 00:30:51.313851 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:30:51 crc kubenswrapper[4911]: I1201 00:30:51.314864 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ee4df74b53d9f3dd3ba8fcd693c318ddcec3ad0d37918e7de6e25ea516415ccd"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:30:51 crc kubenswrapper[4911]: I1201 00:30:51.314995 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://ee4df74b53d9f3dd3ba8fcd693c318ddcec3ad0d37918e7de6e25ea516415ccd" gracePeriod=600 Dec 01 00:30:52 crc kubenswrapper[4911]: I1201 00:30:52.401327 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="ee4df74b53d9f3dd3ba8fcd693c318ddcec3ad0d37918e7de6e25ea516415ccd" exitCode=0 Dec 01 00:30:52 crc kubenswrapper[4911]: I1201 00:30:52.401731 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"ee4df74b53d9f3dd3ba8fcd693c318ddcec3ad0d37918e7de6e25ea516415ccd"} Dec 01 00:30:52 crc kubenswrapper[4911]: I1201 00:30:52.401765 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"} Dec 01 00:30:52 crc kubenswrapper[4911]: I1201 00:30:52.401783 4911 scope.go:117] "RemoveContainer" containerID="3617b2a113e3ef73df7ff22f074cb98d7742d918e195bf015afe52cfb194d2c6" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.034945 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xc257"] Dec 01 00:31:00 crc kubenswrapper[4911]: E1201 00:31:00.035898 
4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a448f0-4680-4a7b-b82a-51f8f4b45613" containerName="collect-profiles" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.035915 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a448f0-4680-4a7b-b82a-51f8f4b45613" containerName="collect-profiles" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.036074 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a448f0-4680-4a7b-b82a-51f8f4b45613" containerName="collect-profiles" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.036982 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.054796 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xc257"] Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.238782 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mm44\" (UniqueName: \"kubernetes.io/projected/86814268-6c92-4818-a3dc-63aeec84bdb6-kube-api-access-6mm44\") pod \"redhat-operators-xc257\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.238842 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-utilities\") pod \"redhat-operators-xc257\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.238878 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-catalog-content\") 
pod \"redhat-operators-xc257\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.340938 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mm44\" (UniqueName: \"kubernetes.io/projected/86814268-6c92-4818-a3dc-63aeec84bdb6-kube-api-access-6mm44\") pod \"redhat-operators-xc257\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.341019 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-utilities\") pod \"redhat-operators-xc257\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.341061 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-catalog-content\") pod \"redhat-operators-xc257\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.341762 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-catalog-content\") pod \"redhat-operators-xc257\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.342691 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-utilities\") pod \"redhat-operators-xc257\" (UID: 
\"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.372252 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mm44\" (UniqueName: \"kubernetes.io/projected/86814268-6c92-4818-a3dc-63aeec84bdb6-kube-api-access-6mm44\") pod \"redhat-operators-xc257\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") " pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.439806 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:00 crc kubenswrapper[4911]: I1201 00:31:00.694047 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xc257"] Dec 01 00:31:01 crc kubenswrapper[4911]: I1201 00:31:01.465289 4911 generic.go:334] "Generic (PLEG): container finished" podID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerID="22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062" exitCode=0 Dec 01 00:31:01 crc kubenswrapper[4911]: I1201 00:31:01.465404 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc257" event={"ID":"86814268-6c92-4818-a3dc-63aeec84bdb6","Type":"ContainerDied","Data":"22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062"} Dec 01 00:31:01 crc kubenswrapper[4911]: I1201 00:31:01.465675 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc257" event={"ID":"86814268-6c92-4818-a3dc-63aeec84bdb6","Type":"ContainerStarted","Data":"411b3f75ba0dd1b92e4c485d92b67ed705bb0d7ba4e120d179578c0bcdbda1ea"} Dec 01 00:31:01 crc kubenswrapper[4911]: I1201 00:31:01.468612 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 00:31:02 crc kubenswrapper[4911]: I1201 00:31:02.474829 4911 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc257" event={"ID":"86814268-6c92-4818-a3dc-63aeec84bdb6","Type":"ContainerStarted","Data":"719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778"} Dec 01 00:31:03 crc kubenswrapper[4911]: I1201 00:31:03.484041 4911 generic.go:334] "Generic (PLEG): container finished" podID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerID="719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778" exitCode=0 Dec 01 00:31:03 crc kubenswrapper[4911]: I1201 00:31:03.484175 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc257" event={"ID":"86814268-6c92-4818-a3dc-63aeec84bdb6","Type":"ContainerDied","Data":"719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778"} Dec 01 00:31:04 crc kubenswrapper[4911]: I1201 00:31:04.494794 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc257" event={"ID":"86814268-6c92-4818-a3dc-63aeec84bdb6","Type":"ContainerStarted","Data":"b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51"} Dec 01 00:31:04 crc kubenswrapper[4911]: I1201 00:31:04.520990 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xc257" podStartSLOduration=1.934986904 podStartE2EDuration="4.520948045s" podCreationTimestamp="2025-12-01 00:31:00 +0000 UTC" firstStartedPulling="2025-12-01 00:31:01.468359688 +0000 UTC m=+1421.607056459" lastFinishedPulling="2025-12-01 00:31:04.054320829 +0000 UTC m=+1424.193017600" observedRunningTime="2025-12-01 00:31:04.515700577 +0000 UTC m=+1424.654397348" watchObservedRunningTime="2025-12-01 00:31:04.520948045 +0000 UTC m=+1424.659644826" Dec 01 00:31:10 crc kubenswrapper[4911]: I1201 00:31:10.440775 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:10 crc kubenswrapper[4911]: I1201 
00:31:10.441260 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:10 crc kubenswrapper[4911]: I1201 00:31:10.481836 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:10 crc kubenswrapper[4911]: I1201 00:31:10.563172 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xc257" Dec 01 00:31:10 crc kubenswrapper[4911]: I1201 00:31:10.721585 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xc257"] Dec 01 00:31:12 crc kubenswrapper[4911]: I1201 00:31:12.541962 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xc257" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerName="registry-server" containerID="cri-o://b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51" gracePeriod=2 Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.280276 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xc257"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.438149 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mm44\" (UniqueName: \"kubernetes.io/projected/86814268-6c92-4818-a3dc-63aeec84bdb6-kube-api-access-6mm44\") pod \"86814268-6c92-4818-a3dc-63aeec84bdb6\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") "
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.438282 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-catalog-content\") pod \"86814268-6c92-4818-a3dc-63aeec84bdb6\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") "
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.438384 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-utilities\") pod \"86814268-6c92-4818-a3dc-63aeec84bdb6\" (UID: \"86814268-6c92-4818-a3dc-63aeec84bdb6\") "
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.439255 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-utilities" (OuterVolumeSpecName: "utilities") pod "86814268-6c92-4818-a3dc-63aeec84bdb6" (UID: "86814268-6c92-4818-a3dc-63aeec84bdb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.445819 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86814268-6c92-4818-a3dc-63aeec84bdb6-kube-api-access-6mm44" (OuterVolumeSpecName: "kube-api-access-6mm44") pod "86814268-6c92-4818-a3dc-63aeec84bdb6" (UID: "86814268-6c92-4818-a3dc-63aeec84bdb6"). InnerVolumeSpecName "kube-api-access-6mm44". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.540873 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mm44\" (UniqueName: \"kubernetes.io/projected/86814268-6c92-4818-a3dc-63aeec84bdb6-kube-api-access-6mm44\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.540915 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.558255 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86814268-6c92-4818-a3dc-63aeec84bdb6" (UID: "86814268-6c92-4818-a3dc-63aeec84bdb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.558616 4911 generic.go:334] "Generic (PLEG): container finished" podID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerID="b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51" exitCode=0
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.558674 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc257" event={"ID":"86814268-6c92-4818-a3dc-63aeec84bdb6","Type":"ContainerDied","Data":"b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51"}
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.558715 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc257" event={"ID":"86814268-6c92-4818-a3dc-63aeec84bdb6","Type":"ContainerDied","Data":"411b3f75ba0dd1b92e4c485d92b67ed705bb0d7ba4e120d179578c0bcdbda1ea"}
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.558747 4911 scope.go:117] "RemoveContainer" containerID="b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.558921 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xc257"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.593944 4911 scope.go:117] "RemoveContainer" containerID="719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.614131 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xc257"]
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.619752 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xc257"]
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.627297 4911 scope.go:117] "RemoveContainer" containerID="22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.642018 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86814268-6c92-4818-a3dc-63aeec84bdb6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.643615 4911 scope.go:117] "RemoveContainer" containerID="b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51"
Dec 01 00:31:14 crc kubenswrapper[4911]: E1201 00:31:14.644422 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51\": container with ID starting with b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51 not found: ID does not exist" containerID="b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.644464 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51"} err="failed to get container status \"b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51\": rpc error: code = NotFound desc = could not find container \"b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51\": container with ID starting with b2f838cd454fc35961e9c593504ed66b38b26dcb412a9ac30ca671ae82a22f51 not found: ID does not exist"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.644503 4911 scope.go:117] "RemoveContainer" containerID="719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778"
Dec 01 00:31:14 crc kubenswrapper[4911]: E1201 00:31:14.644973 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778\": container with ID starting with 719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778 not found: ID does not exist" containerID="719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.645011 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778"} err="failed to get container status \"719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778\": rpc error: code = NotFound desc = could not find container \"719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778\": container with ID starting with 719a5fe9f4ff0aeb4659faba89851ae30a9e16ffb16471a36395aa160f390778 not found: ID does not exist"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.645042 4911 scope.go:117] "RemoveContainer" containerID="22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062"
Dec 01 00:31:14 crc kubenswrapper[4911]: E1201 00:31:14.645302 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062\": container with ID starting with 22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062 not found: ID does not exist" containerID="22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062"
Dec 01 00:31:14 crc kubenswrapper[4911]: I1201 00:31:14.645327 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062"} err="failed to get container status \"22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062\": rpc error: code = NotFound desc = could not find container \"22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062\": container with ID starting with 22698eecc80a11597c7831131951b257e0aa035756d9c51bbb729c0e5e5b2062 not found: ID does not exist"
Dec 01 00:31:16 crc kubenswrapper[4911]: I1201 00:31:16.160972 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" path="/var/lib/kubelet/pods/86814268-6c92-4818-a3dc-63aeec84bdb6/volumes"
Dec 01 00:31:20 crc kubenswrapper[4911]: I1201 00:31:20.605367 4911 generic.go:334] "Generic (PLEG): container finished" podID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerID="1f5c09adb8a60323da75470d716caf27c87cfc2eeb7a5880a68ad4cf5d42e20b" exitCode=0
Dec 01 00:31:20 crc kubenswrapper[4911]: I1201 00:31:20.605603 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f","Type":"ContainerDied","Data":"1f5c09adb8a60323da75470d716caf27c87cfc2eeb7a5880a68ad4cf5d42e20b"}
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.841582 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.969641 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-blob-cache\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.969813 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-root\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.969911 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-proxy-ca-bundles\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970012 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-run\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970094 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildworkdir\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970182 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-pull\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970268 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-system-configs\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970389 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wzl4\" (UniqueName: \"kubernetes.io/projected/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-kube-api-access-8wzl4\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970435 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-node-pullsecrets\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970507 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-push\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970553 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-ca-bundles\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.970619 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildcachedir\") pod \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\" (UID: \"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f\") "
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.971094 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.972058 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.972666 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.972779 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.973873 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.974383 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.986876 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.988892 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-kube-api-access-8wzl4" (OuterVolumeSpecName: "kube-api-access-8wzl4") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "kube-api-access-8wzl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.989004 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 00:31:21 crc kubenswrapper[4911]: I1201 00:31:21.989090 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079419 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079476 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-run\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079490 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildworkdir\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079505 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079520 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-system-configs\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079530 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wzl4\" (UniqueName: \"kubernetes.io/projected/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-kube-api-access-8wzl4\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079541 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079551 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079561 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.079571 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-buildcachedir\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.314164 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.383165 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-build-blob-cache\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.621693 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f","Type":"ContainerDied","Data":"0289f5f94f1e5f0703962a7c2abcbd7c8d114725c0d4160d143929c5d8f89609"}
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.621748 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0289f5f94f1e5f0703962a7c2abcbd7c8d114725c0d4160d143929c5d8f89609"
Dec 01 00:31:22 crc kubenswrapper[4911]: I1201 00:31:22.621825 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build"
Dec 01 00:31:24 crc kubenswrapper[4911]: I1201 00:31:24.557397 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" (UID: "9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:31:24 crc kubenswrapper[4911]: I1201 00:31:24.611353 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f-container-storage-root\") on node \"crc\" DevicePath \"\""
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.758916 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Dec 01 00:31:26 crc kubenswrapper[4911]: E1201 00:31:26.759538 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerName="registry-server"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.759558 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerName="registry-server"
Dec 01 00:31:26 crc kubenswrapper[4911]: E1201 00:31:26.759573 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerName="docker-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.759581 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerName="docker-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: E1201 00:31:26.759591 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerName="extract-utilities"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.759599 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerName="extract-utilities"
Dec 01 00:31:26 crc kubenswrapper[4911]: E1201 00:31:26.759610 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerName="extract-content"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.759620 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerName="extract-content"
Dec 01 00:31:26 crc kubenswrapper[4911]: E1201 00:31:26.759635 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerName="manage-dockerfile"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.759642 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerName="manage-dockerfile"
Dec 01 00:31:26 crc kubenswrapper[4911]: E1201 00:31:26.759659 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerName="git-clone"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.759668 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerName="git-clone"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.759798 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="86814268-6c92-4818-a3dc-63aeec84bdb6" containerName="registry-server"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.759818 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fbb3a5c-bba4-497d-a28e-e6e7d3d2244f" containerName="docker-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.760443 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.764894 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.765234 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.765247 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.765988 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.781409 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"]
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.949949 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-push\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.950477 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-pull\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.950746 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.950908 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.951195 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.951363 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.951537 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.951705 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.951806 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.951931 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tpp\" (UniqueName: \"kubernetes.io/projected/0b36d651-c713-4e6b-96bb-f4d81d36e289-kube-api-access-l6tpp\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.952215 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:26 crc kubenswrapper[4911]: I1201 00:31:26.952337 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053351 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053403 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053434 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-push\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053453 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-pull\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053484 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053512 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053528 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053566 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053603 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053631 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053646 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.053663 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tpp\" (UniqueName: \"kubernetes.io/projected/0b36d651-c713-4e6b-96bb-f4d81d36e289-kube-api-access-l6tpp\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.054090 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.054123 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.055281 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.055438 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.055701 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.055739 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.055717 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.056049 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.056643 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build"
Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.061068 4911 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-push\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build" Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.063323 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-pull\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build" Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.072019 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tpp\" (UniqueName: \"kubernetes.io/projected/0b36d651-c713-4e6b-96bb-f4d81d36e289-kube-api-access-l6tpp\") pod \"sg-bridge-1-build\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " pod="service-telemetry/sg-bridge-1-build" Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.075603 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.282257 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Dec 01 00:31:27 crc kubenswrapper[4911]: I1201 00:31:27.658397 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0b36d651-c713-4e6b-96bb-f4d81d36e289","Type":"ContainerStarted","Data":"025cb3368624b118411746f6194bfe138a858c3d1149713210eb1ef3e5716380"} Dec 01 00:31:29 crc kubenswrapper[4911]: I1201 00:31:29.672875 4911 generic.go:334] "Generic (PLEG): container finished" podID="0b36d651-c713-4e6b-96bb-f4d81d36e289" containerID="4cce0e123d9962eb9f42fcc386e1b09e9798ef130dda73f6fc547de64b0e3bd0" exitCode=0 Dec 01 00:31:29 crc kubenswrapper[4911]: I1201 00:31:29.673014 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0b36d651-c713-4e6b-96bb-f4d81d36e289","Type":"ContainerDied","Data":"4cce0e123d9962eb9f42fcc386e1b09e9798ef130dda73f6fc547de64b0e3bd0"} Dec 01 00:31:30 crc kubenswrapper[4911]: I1201 00:31:30.682181 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0b36d651-c713-4e6b-96bb-f4d81d36e289","Type":"ContainerStarted","Data":"59b58a4c3a5984a457b43adfe5c2a882a1da3138754939e4328685e65982aa40"} Dec 01 00:31:30 crc kubenswrapper[4911]: I1201 00:31:30.708591 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=4.708568778 podStartE2EDuration="4.708568778s" podCreationTimestamp="2025-12-01 00:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:31:30.706186251 +0000 UTC m=+1450.844883042" watchObservedRunningTime="2025-12-01 00:31:30.708568778 +0000 UTC m=+1450.847265559" Dec 01 00:31:36 
crc kubenswrapper[4911]: I1201 00:31:36.977194 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Dec 01 00:31:36 crc kubenswrapper[4911]: I1201 00:31:36.978129 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="0b36d651-c713-4e6b-96bb-f4d81d36e289" containerName="docker-build" containerID="cri-o://59b58a4c3a5984a457b43adfe5c2a882a1da3138754939e4328685e65982aa40" gracePeriod=30 Dec 01 00:31:37 crc kubenswrapper[4911]: I1201 00:31:37.734655 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_0b36d651-c713-4e6b-96bb-f4d81d36e289/docker-build/0.log" Dec 01 00:31:37 crc kubenswrapper[4911]: I1201 00:31:37.735189 4911 generic.go:334] "Generic (PLEG): container finished" podID="0b36d651-c713-4e6b-96bb-f4d81d36e289" containerID="59b58a4c3a5984a457b43adfe5c2a882a1da3138754939e4328685e65982aa40" exitCode=1 Dec 01 00:31:37 crc kubenswrapper[4911]: I1201 00:31:37.735225 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0b36d651-c713-4e6b-96bb-f4d81d36e289","Type":"ContainerDied","Data":"59b58a4c3a5984a457b43adfe5c2a882a1da3138754939e4328685e65982aa40"} Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.508654 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_0b36d651-c713-4e6b-96bb-f4d81d36e289/docker-build/0.log" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.509433 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623245 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-blob-cache\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623324 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-ca-bundles\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623355 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-proxy-ca-bundles\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623410 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-system-configs\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623449 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-pull\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623502 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-node-pullsecrets\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623535 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-run\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623575 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-push\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623610 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6tpp\" (UniqueName: \"kubernetes.io/projected/0b36d651-c713-4e6b-96bb-f4d81d36e289-kube-api-access-l6tpp\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623641 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildcachedir\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623631 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod 
"0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623666 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-root\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623708 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildworkdir\") pod \"0b36d651-c713-4e6b-96bb-f4d81d36e289\" (UID: \"0b36d651-c713-4e6b-96bb-f4d81d36e289\") " Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.623986 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.624155 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.626809 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.629913 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.630224 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.630718 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.631237 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.637591 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b36d651-c713-4e6b-96bb-f4d81d36e289-kube-api-access-l6tpp" (OuterVolumeSpecName: "kube-api-access-l6tpp") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "kube-api-access-l6tpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.639586 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.640547 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.721868 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Dec 01 00:31:38 crc kubenswrapper[4911]: E1201 00:31:38.722174 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b36d651-c713-4e6b-96bb-f4d81d36e289" containerName="manage-dockerfile" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.722188 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b36d651-c713-4e6b-96bb-f4d81d36e289" containerName="manage-dockerfile" Dec 01 00:31:38 crc kubenswrapper[4911]: E1201 00:31:38.722204 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b36d651-c713-4e6b-96bb-f4d81d36e289" containerName="docker-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.722212 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b36d651-c713-4e6b-96bb-f4d81d36e289" containerName="docker-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.722332 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b36d651-c713-4e6b-96bb-f4d81d36e289" containerName="docker-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.723162 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725350 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725379 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725389 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6tpp\" (UniqueName: \"kubernetes.io/projected/0b36d651-c713-4e6b-96bb-f4d81d36e289-kube-api-access-l6tpp\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725401 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725411 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725420 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725428 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725438 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.725446 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/0b36d651-c713-4e6b-96bb-f4d81d36e289-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.727905 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.728249 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.728473 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.734209 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.743657 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_0b36d651-c713-4e6b-96bb-f4d81d36e289/docker-build/0.log" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.743998 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0b36d651-c713-4e6b-96bb-f4d81d36e289","Type":"ContainerDied","Data":"025cb3368624b118411746f6194bfe138a858c3d1149713210eb1ef3e5716380"} Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.744045 4911 scope.go:117] "RemoveContainer" containerID="59b58a4c3a5984a457b43adfe5c2a882a1da3138754939e4328685e65982aa40" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.744090 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.744170 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.788835 4911 scope.go:117] "RemoveContainer" containerID="4cce0e123d9962eb9f42fcc386e1b09e9798ef130dda73f6fc547de64b0e3bd0" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826404 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826446 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826481 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826499 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826518 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nwwbs\" (UniqueName: \"kubernetes.io/projected/d2afad22-13ae-44bc-bd02-889ebe578133-kube-api-access-nwwbs\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826551 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826586 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826605 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826669 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-push\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826690 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-pull\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826725 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826746 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.826778 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928053 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwbs\" (UniqueName: \"kubernetes.io/projected/d2afad22-13ae-44bc-bd02-889ebe578133-kube-api-access-nwwbs\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928115 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928134 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928151 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928201 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-push\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928221 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-pull\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928239 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928263 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928282 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928299 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928319 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928336 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-node-pullsecrets\") pod \"sg-bridge-2-build\" 
(UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928476 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928698 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928750 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.928756 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.929041 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc 
kubenswrapper[4911]: I1201 00:31:38.929145 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.929419 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.929431 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.930039 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.932193 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-pull\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.932196 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-push\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:38 crc kubenswrapper[4911]: I1201 00:31:38.945958 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwbs\" (UniqueName: \"kubernetes.io/projected/d2afad22-13ae-44bc-bd02-889ebe578133-kube-api-access-nwwbs\") pod \"sg-bridge-2-build\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:39 crc kubenswrapper[4911]: I1201 00:31:39.041298 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Dec 01 00:31:39 crc kubenswrapper[4911]: I1201 00:31:39.068716 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "0b36d651-c713-4e6b-96bb-f4d81d36e289" (UID: "0b36d651-c713-4e6b-96bb-f4d81d36e289"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:31:39 crc kubenswrapper[4911]: I1201 00:31:39.132453 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0b36d651-c713-4e6b-96bb-f4d81d36e289-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:31:39 crc kubenswrapper[4911]: I1201 00:31:39.273937 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Dec 01 00:31:39 crc kubenswrapper[4911]: I1201 00:31:39.379294 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Dec 01 00:31:39 crc kubenswrapper[4911]: I1201 00:31:39.385035 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Dec 01 00:31:39 crc kubenswrapper[4911]: I1201 00:31:39.755231 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"d2afad22-13ae-44bc-bd02-889ebe578133","Type":"ContainerStarted","Data":"18c5410a8deef6364356a8d1cde7a365584ca638a6a66f5a86600690820106ac"} Dec 01 00:31:39 crc kubenswrapper[4911]: I1201 00:31:39.755286 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"d2afad22-13ae-44bc-bd02-889ebe578133","Type":"ContainerStarted","Data":"01973016bdbd6d32e57204dbf66b0b51df0d9cae1ca9ddae5c1a58ce8acd08a0"} Dec 01 00:31:40 crc kubenswrapper[4911]: I1201 00:31:40.174566 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b36d651-c713-4e6b-96bb-f4d81d36e289" path="/var/lib/kubelet/pods/0b36d651-c713-4e6b-96bb-f4d81d36e289/volumes" Dec 01 00:31:40 crc kubenswrapper[4911]: I1201 00:31:40.787965 4911 generic.go:334] "Generic (PLEG): container finished" podID="d2afad22-13ae-44bc-bd02-889ebe578133" containerID="18c5410a8deef6364356a8d1cde7a365584ca638a6a66f5a86600690820106ac" exitCode=0 Dec 01 00:31:40 crc kubenswrapper[4911]: I1201 
00:31:40.788012 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"d2afad22-13ae-44bc-bd02-889ebe578133","Type":"ContainerDied","Data":"18c5410a8deef6364356a8d1cde7a365584ca638a6a66f5a86600690820106ac"} Dec 01 00:31:41 crc kubenswrapper[4911]: I1201 00:31:41.796178 4911 generic.go:334] "Generic (PLEG): container finished" podID="d2afad22-13ae-44bc-bd02-889ebe578133" containerID="11a3136dae3e34bfe8ec2152686c6dba86264bea2e2f914963fd0d1b151737cf" exitCode=0 Dec 01 00:31:41 crc kubenswrapper[4911]: I1201 00:31:41.796240 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"d2afad22-13ae-44bc-bd02-889ebe578133","Type":"ContainerDied","Data":"11a3136dae3e34bfe8ec2152686c6dba86264bea2e2f914963fd0d1b151737cf"} Dec 01 00:31:41 crc kubenswrapper[4911]: I1201 00:31:41.832208 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_d2afad22-13ae-44bc-bd02-889ebe578133/manage-dockerfile/0.log" Dec 01 00:31:42 crc kubenswrapper[4911]: I1201 00:31:42.803921 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"d2afad22-13ae-44bc-bd02-889ebe578133","Type":"ContainerStarted","Data":"04bd477df099ce478dd4d1cc4d6370e4000215db763c49b79fa07376d66d0664"} Dec 01 00:31:42 crc kubenswrapper[4911]: I1201 00:31:42.830759 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=4.830737678 podStartE2EDuration="4.830737678s" podCreationTimestamp="2025-12-01 00:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:31:42.829293057 +0000 UTC m=+1462.967989828" watchObservedRunningTime="2025-12-01 00:31:42.830737678 +0000 UTC m=+1462.969434449" Dec 01 00:32:31 crc kubenswrapper[4911]: I1201 00:32:31.212304 4911 
generic.go:334] "Generic (PLEG): container finished" podID="d2afad22-13ae-44bc-bd02-889ebe578133" containerID="04bd477df099ce478dd4d1cc4d6370e4000215db763c49b79fa07376d66d0664" exitCode=0 Dec 01 00:32:31 crc kubenswrapper[4911]: I1201 00:32:31.212558 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"d2afad22-13ae-44bc-bd02-889ebe578133","Type":"ContainerDied","Data":"04bd477df099ce478dd4d1cc4d6370e4000215db763c49b79fa07376d66d0664"} Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.493957 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.627865 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-pull\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.627942 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-buildworkdir\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628017 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-ca-bundles\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628051 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: 
\"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-push\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628143 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-build-blob-cache\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628209 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-buildcachedir\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628259 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-root\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628288 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-run\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628303 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-node-pullsecrets\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 
00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628341 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwwbs\" (UniqueName: \"kubernetes.io/projected/d2afad22-13ae-44bc-bd02-889ebe578133-kube-api-access-nwwbs\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628382 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-system-configs\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628419 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-proxy-ca-bundles\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628870 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628921 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.628988 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.629329 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.629347 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.630207 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.630523 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.633839 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.633954 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.634046 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2afad22-13ae-44bc-bd02-889ebe578133-kube-api-access-nwwbs" (OuterVolumeSpecName: "kube-api-access-nwwbs") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "kube-api-access-nwwbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.730978 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731049 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731060 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731073 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d2afad22-13ae-44bc-bd02-889ebe578133-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731084 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731095 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731106 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2afad22-13ae-44bc-bd02-889ebe578133-node-pullsecrets\") on node \"crc\" DevicePath \"\"" 
Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731116 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwwbs\" (UniqueName: \"kubernetes.io/projected/d2afad22-13ae-44bc-bd02-889ebe578133-kube-api-access-nwwbs\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731126 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.731136 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2afad22-13ae-44bc-bd02-889ebe578133-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.753400 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:32 crc kubenswrapper[4911]: I1201 00:32:32.832299 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:33 crc kubenswrapper[4911]: I1201 00:32:33.231737 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"d2afad22-13ae-44bc-bd02-889ebe578133","Type":"ContainerDied","Data":"01973016bdbd6d32e57204dbf66b0b51df0d9cae1ca9ddae5c1a58ce8acd08a0"} Dec 01 00:32:33 crc kubenswrapper[4911]: I1201 00:32:33.231777 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01973016bdbd6d32e57204dbf66b0b51df0d9cae1ca9ddae5c1a58ce8acd08a0" Dec 01 00:32:33 crc kubenswrapper[4911]: I1201 00:32:33.231804 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Dec 01 00:32:33 crc kubenswrapper[4911]: I1201 00:32:33.439832 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:33 crc kubenswrapper[4911]: I1201 00:32:33.440246 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-root\") pod \"d2afad22-13ae-44bc-bd02-889ebe578133\" (UID: \"d2afad22-13ae-44bc-bd02-889ebe578133\") " Dec 01 00:32:33 crc kubenswrapper[4911]: W1201 00:32:33.440477 4911 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d2afad22-13ae-44bc-bd02-889ebe578133/volumes/kubernetes.io~empty-dir/container-storage-root Dec 01 00:32:33 crc kubenswrapper[4911]: I1201 00:32:33.440495 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d2afad22-13ae-44bc-bd02-889ebe578133" (UID: "d2afad22-13ae-44bc-bd02-889ebe578133"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:33 crc kubenswrapper[4911]: I1201 00:32:33.440764 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2afad22-13ae-44bc-bd02-889ebe578133-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.353046 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Dec 01 00:32:38 crc kubenswrapper[4911]: E1201 00:32:38.353770 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2afad22-13ae-44bc-bd02-889ebe578133" containerName="manage-dockerfile" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.353791 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2afad22-13ae-44bc-bd02-889ebe578133" containerName="manage-dockerfile" Dec 01 00:32:38 crc kubenswrapper[4911]: E1201 00:32:38.353815 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2afad22-13ae-44bc-bd02-889ebe578133" containerName="git-clone" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.353826 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2afad22-13ae-44bc-bd02-889ebe578133" containerName="git-clone" Dec 01 00:32:38 crc kubenswrapper[4911]: E1201 00:32:38.353842 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2afad22-13ae-44bc-bd02-889ebe578133" containerName="docker-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.353854 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2afad22-13ae-44bc-bd02-889ebe578133" containerName="docker-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.354048 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2afad22-13ae-44bc-bd02-889ebe578133" containerName="docker-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.355015 4911 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.357678 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.357687 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.357760 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.360427 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.369284 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511118 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511199 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511226 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511252 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511296 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511316 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511369 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511393 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511412 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511434 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkcvp\" (UniqueName: \"kubernetes.io/projected/d782a9f8-1a65-44c0-9737-6f63f26ba120-kube-api-access-vkcvp\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511569 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.511607 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613251 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613311 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613335 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613357 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkcvp\" (UniqueName: \"kubernetes.io/projected/d782a9f8-1a65-44c0-9737-6f63f26ba120-kube-api-access-vkcvp\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613405 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613422 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613428 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613454 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613624 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613678 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613746 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613867 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613875 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.613957 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.614127 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.614263 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.614298 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.614678 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.615367 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-root\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.615779 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.615890 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.619392 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.619698 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.632183 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkcvp\" (UniqueName: 
\"kubernetes.io/projected/d782a9f8-1a65-44c0-9737-6f63f26ba120-kube-api-access-vkcvp\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.672177 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:38 crc kubenswrapper[4911]: I1201 00:32:38.880200 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Dec 01 00:32:39 crc kubenswrapper[4911]: I1201 00:32:39.275668 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"d782a9f8-1a65-44c0-9737-6f63f26ba120","Type":"ContainerStarted","Data":"176db9a3278492a7e93cfcb3113f40f5b38e8622518dda64181b0e5762a13129"} Dec 01 00:32:39 crc kubenswrapper[4911]: I1201 00:32:39.275974 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"d782a9f8-1a65-44c0-9737-6f63f26ba120","Type":"ContainerStarted","Data":"3279b29c98bfbcaf5c0ad28f148d734f2f91fe39561b899e2485a8e8fc7ade9b"} Dec 01 00:32:40 crc kubenswrapper[4911]: I1201 00:32:40.284353 4911 generic.go:334] "Generic (PLEG): container finished" podID="d782a9f8-1a65-44c0-9737-6f63f26ba120" containerID="176db9a3278492a7e93cfcb3113f40f5b38e8622518dda64181b0e5762a13129" exitCode=0 Dec 01 00:32:40 crc kubenswrapper[4911]: I1201 00:32:40.284406 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"d782a9f8-1a65-44c0-9737-6f63f26ba120","Type":"ContainerDied","Data":"176db9a3278492a7e93cfcb3113f40f5b38e8622518dda64181b0e5762a13129"} Dec 01 00:32:41 crc kubenswrapper[4911]: I1201 00:32:41.299138 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"d782a9f8-1a65-44c0-9737-6f63f26ba120","Type":"ContainerStarted","Data":"5177f14db6681e00da853852c4577d224803bb6ff8757ac3922049ffbce3beb4"} Dec 01 00:32:41 crc kubenswrapper[4911]: I1201 00:32:41.331898 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.331879245 podStartE2EDuration="3.331879245s" podCreationTimestamp="2025-12-01 00:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:32:41.324744734 +0000 UTC m=+1521.463441505" watchObservedRunningTime="2025-12-01 00:32:41.331879245 +0000 UTC m=+1521.470576016" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.199671 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.200295 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="d782a9f8-1a65-44c0-9737-6f63f26ba120" containerName="docker-build" containerID="cri-o://5177f14db6681e00da853852c4577d224803bb6ff8757ac3922049ffbce3beb4" gracePeriod=30 Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.350218 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_d782a9f8-1a65-44c0-9737-6f63f26ba120/docker-build/0.log" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.351209 4911 generic.go:334] "Generic (PLEG): container finished" podID="d782a9f8-1a65-44c0-9737-6f63f26ba120" containerID="5177f14db6681e00da853852c4577d224803bb6ff8757ac3922049ffbce3beb4" exitCode=1 Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.351242 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" 
event={"ID":"d782a9f8-1a65-44c0-9737-6f63f26ba120","Type":"ContainerDied","Data":"5177f14db6681e00da853852c4577d224803bb6ff8757ac3922049ffbce3beb4"} Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.575147 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_d782a9f8-1a65-44c0-9737-6f63f26ba120/docker-build/0.log" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.575658 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660421 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-pull\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660498 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-system-configs\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660561 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-push\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660582 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-blob-cache\") pod 
\"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660601 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-root\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660640 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-node-pullsecrets\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660683 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkcvp\" (UniqueName: \"kubernetes.io/projected/d782a9f8-1a65-44c0-9737-6f63f26ba120-kube-api-access-vkcvp\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660704 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-run\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660725 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildworkdir\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660749 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-ca-bundles\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660781 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-proxy-ca-bundles\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.660798 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildcachedir\") pod \"d782a9f8-1a65-44c0-9737-6f63f26ba120\" (UID: \"d782a9f8-1a65-44c0-9737-6f63f26ba120\") " Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.661078 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.661513 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.662299 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.662530 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.662930 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.663358 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.667532 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.669024 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.674716 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d782a9f8-1a65-44c0-9737-6f63f26ba120-kube-api-access-vkcvp" (OuterVolumeSpecName: "kube-api-access-vkcvp") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "kube-api-access-vkcvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.679670 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.732180 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762332 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkcvp\" (UniqueName: \"kubernetes.io/projected/d782a9f8-1a65-44c0-9737-6f63f26ba120-kube-api-access-vkcvp\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762367 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762395 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762406 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762418 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762428 4911 reconciler_common.go:293] "Volume detached for 
volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762437 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762445 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762453 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/d782a9f8-1a65-44c0-9737-6f63f26ba120-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762460 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:48 crc kubenswrapper[4911]: I1201 00:32:48.762468 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d782a9f8-1a65-44c0-9737-6f63f26ba120-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.003022 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d782a9f8-1a65-44c0-9737-6f63f26ba120" (UID: "d782a9f8-1a65-44c0-9737-6f63f26ba120"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.066709 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d782a9f8-1a65-44c0-9737-6f63f26ba120-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.362172 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_d782a9f8-1a65-44c0-9737-6f63f26ba120/docker-build/0.log" Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.363468 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"d782a9f8-1a65-44c0-9737-6f63f26ba120","Type":"ContainerDied","Data":"3279b29c98bfbcaf5c0ad28f148d734f2f91fe39561b899e2485a8e8fc7ade9b"} Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.363599 4911 scope.go:117] "RemoveContainer" containerID="5177f14db6681e00da853852c4577d224803bb6ff8757ac3922049ffbce3beb4" Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.363538 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.397074 4911 scope.go:117] "RemoveContainer" containerID="176db9a3278492a7e93cfcb3113f40f5b38e8622518dda64181b0e5762a13129"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.410018 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.431825 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.861267 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Dec 01 00:32:49 crc kubenswrapper[4911]: E1201 00:32:49.861540 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d782a9f8-1a65-44c0-9737-6f63f26ba120" containerName="docker-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.861553 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d782a9f8-1a65-44c0-9737-6f63f26ba120" containerName="docker-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: E1201 00:32:49.861575 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d782a9f8-1a65-44c0-9737-6f63f26ba120" containerName="manage-dockerfile"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.861583 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="d782a9f8-1a65-44c0-9737-6f63f26ba120" containerName="manage-dockerfile"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.861701 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="d782a9f8-1a65-44c0-9737-6f63f26ba120" containerName="docker-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.862543 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.867545 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.867773 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.867869 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.868078 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.879161 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.990678 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.990728 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.990800 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.990853 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.990905 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.990935 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.990996 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.991135 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnj2s\" (UniqueName: \"kubernetes.io/projected/ac356dd7-4910-48aa-9044-7a1bd0d59785-kube-api-access-vnj2s\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.991263 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.991346 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.991410 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:49 crc kubenswrapper[4911]: I1201 00:32:49.991516 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.092838 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.092893 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.092918 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.092989 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093036 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093052 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093076 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093106 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093141 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093172 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093200 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093234 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnj2s\" (UniqueName: \"kubernetes.io/projected/ac356dd7-4910-48aa-9044-7a1bd0d59785-kube-api-access-vnj2s\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093267 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093377 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093620 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093626 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093860 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093882 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.093958 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.094280 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.094290 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.097869 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.098702 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.111680 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnj2s\" (UniqueName: \"kubernetes.io/projected/ac356dd7-4910-48aa-9044-7a1bd0d59785-kube-api-access-vnj2s\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.159292 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d782a9f8-1a65-44c0-9737-6f63f26ba120" path="/var/lib/kubelet/pods/d782a9f8-1a65-44c0-9737-6f63f26ba120/volumes"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.176486 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Dec 01 00:32:50 crc kubenswrapper[4911]: I1201 00:32:50.382819 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Dec 01 00:32:51 crc kubenswrapper[4911]: I1201 00:32:51.387851 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"ac356dd7-4910-48aa-9044-7a1bd0d59785","Type":"ContainerStarted","Data":"a566fb7cdb64151a66b446abb46655ebc5a4702c0f54acad3e786a88dfbb3ee4"}
Dec 01 00:32:51 crc kubenswrapper[4911]: I1201 00:32:51.388226 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"ac356dd7-4910-48aa-9044-7a1bd0d59785","Type":"ContainerStarted","Data":"edd66827aa1fca9bb88e4c1267319b3d6c9b3448c0f66223824e10b3d2316276"}
Dec 01 00:32:52 crc kubenswrapper[4911]: I1201 00:32:52.397301 4911 generic.go:334] "Generic (PLEG): container finished" podID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerID="a566fb7cdb64151a66b446abb46655ebc5a4702c0f54acad3e786a88dfbb3ee4" exitCode=0
Dec 01 00:32:52 crc kubenswrapper[4911]: I1201 00:32:52.397384 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"ac356dd7-4910-48aa-9044-7a1bd0d59785","Type":"ContainerDied","Data":"a566fb7cdb64151a66b446abb46655ebc5a4702c0f54acad3e786a88dfbb3ee4"}
Dec 01 00:32:53 crc kubenswrapper[4911]: I1201 00:32:53.407895 4911 generic.go:334] "Generic (PLEG): container finished" podID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerID="291079c27dcd98318fd90fa884c65415bb66751462f7294ca0679567131fe47d" exitCode=0
Dec 01 00:32:53 crc kubenswrapper[4911]: I1201 00:32:53.407968 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"ac356dd7-4910-48aa-9044-7a1bd0d59785","Type":"ContainerDied","Data":"291079c27dcd98318fd90fa884c65415bb66751462f7294ca0679567131fe47d"}
Dec 01 00:32:53 crc kubenswrapper[4911]: I1201 00:32:53.445887 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_ac356dd7-4910-48aa-9044-7a1bd0d59785/manage-dockerfile/0.log"
Dec 01 00:32:54 crc kubenswrapper[4911]: I1201 00:32:54.416958 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"ac356dd7-4910-48aa-9044-7a1bd0d59785","Type":"ContainerStarted","Data":"7e380b8e77b5a54453eed08c43202295943085268ac5e315175e53ffd32aaf8f"}
Dec 01 00:32:54 crc kubenswrapper[4911]: I1201 00:32:54.449607 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.449489621 podStartE2EDuration="5.449489621s" podCreationTimestamp="2025-12-01 00:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:32:54.448206345 +0000 UTC m=+1534.586903136" watchObservedRunningTime="2025-12-01 00:32:54.449489621 +0000 UTC m=+1534.588186402"
Dec 01 00:33:21 crc kubenswrapper[4911]: I1201 00:33:21.312395 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 00:33:21 crc kubenswrapper[4911]: I1201 00:33:21.312985 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.241218 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vhp5n"]
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.243427 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.258268 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vhp5n"]
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.293972 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-utilities\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.294053 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62k8\" (UniqueName: \"kubernetes.io/projected/e5ad925e-671c-430a-b4fd-5c144fe69531-kube-api-access-g62k8\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.294132 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-catalog-content\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.395773 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-utilities\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.395845 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g62k8\" (UniqueName: \"kubernetes.io/projected/e5ad925e-671c-430a-b4fd-5c144fe69531-kube-api-access-g62k8\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.395878 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-catalog-content\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.396390 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-utilities\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.396475 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-catalog-content\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.416580 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g62k8\" (UniqueName: \"kubernetes.io/projected/e5ad925e-671c-430a-b4fd-5c144fe69531-kube-api-access-g62k8\") pod \"certified-operators-vhp5n\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") " pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:40 crc kubenswrapper[4911]: I1201 00:33:40.576426 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:41 crc kubenswrapper[4911]: I1201 00:33:41.057586 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vhp5n"]
Dec 01 00:33:41 crc kubenswrapper[4911]: I1201 00:33:41.911276 4911 generic.go:334] "Generic (PLEG): container finished" podID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerID="2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8" exitCode=0
Dec 01 00:33:41 crc kubenswrapper[4911]: I1201 00:33:41.911366 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhp5n" event={"ID":"e5ad925e-671c-430a-b4fd-5c144fe69531","Type":"ContainerDied","Data":"2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8"}
Dec 01 00:33:41 crc kubenswrapper[4911]: I1201 00:33:41.911989 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhp5n" event={"ID":"e5ad925e-671c-430a-b4fd-5c144fe69531","Type":"ContainerStarted","Data":"d75b264c425c8d3cc344f29b162b4bfefacb681b148be02661741fd8cfe195c6"}
Dec 01 00:33:42 crc kubenswrapper[4911]: I1201 00:33:42.919782 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhp5n" event={"ID":"e5ad925e-671c-430a-b4fd-5c144fe69531","Type":"ContainerStarted","Data":"b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976"}
Dec 01 00:33:43 crc kubenswrapper[4911]: I1201 00:33:43.935169 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhp5n" event={"ID":"e5ad925e-671c-430a-b4fd-5c144fe69531","Type":"ContainerDied","Data":"b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976"}
Dec 01 00:33:43 crc kubenswrapper[4911]: I1201 00:33:43.935014 4911 generic.go:334] "Generic (PLEG): container finished" podID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerID="b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976" exitCode=0
Dec 01 00:33:44 crc kubenswrapper[4911]: I1201 00:33:44.957866 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhp5n" event={"ID":"e5ad925e-671c-430a-b4fd-5c144fe69531","Type":"ContainerStarted","Data":"8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8"}
Dec 01 00:33:44 crc kubenswrapper[4911]: I1201 00:33:44.974981 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vhp5n" podStartSLOduration=2.509424858 podStartE2EDuration="4.974963542s" podCreationTimestamp="2025-12-01 00:33:40 +0000 UTC" firstStartedPulling="2025-12-01 00:33:41.913357924 +0000 UTC m=+1582.052054695" lastFinishedPulling="2025-12-01 00:33:44.378896608 +0000 UTC m=+1584.517593379" observedRunningTime="2025-12-01 00:33:44.973178922 +0000 UTC m=+1585.111875723" watchObservedRunningTime="2025-12-01 00:33:44.974963542 +0000 UTC m=+1585.113660313"
Dec 01 00:33:50 crc kubenswrapper[4911]: I1201 00:33:50.576842 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:50 crc kubenswrapper[4911]: I1201 00:33:50.577713 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:50 crc kubenswrapper[4911]: I1201 00:33:50.646357 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:51 crc kubenswrapper[4911]: I1201 00:33:51.049089 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:51 crc kubenswrapper[4911]: I1201 00:33:51.097016 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vhp5n"]
Dec 01 00:33:51 crc kubenswrapper[4911]: I1201 00:33:51.311638 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 00:33:51 crc kubenswrapper[4911]: I1201 00:33:51.311712 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.028089 4911 generic.go:334] "Generic (PLEG): container finished" podID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerID="7e380b8e77b5a54453eed08c43202295943085268ac5e315175e53ffd32aaf8f" exitCode=0
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.028156 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"ac356dd7-4910-48aa-9044-7a1bd0d59785","Type":"ContainerDied","Data":"7e380b8e77b5a54453eed08c43202295943085268ac5e315175e53ffd32aaf8f"}
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.028327 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vhp5n" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerName="registry-server" containerID="cri-o://8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8" gracePeriod=2
Dec 01 00:33:53 crc kubenswrapper[4911]: E1201 00:33:53.108896 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ad925e_671c_430a_b4fd_5c144fe69531.slice/crio-conmon-8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.387492 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.493769 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g62k8\" (UniqueName: \"kubernetes.io/projected/e5ad925e-671c-430a-b4fd-5c144fe69531-kube-api-access-g62k8\") pod \"e5ad925e-671c-430a-b4fd-5c144fe69531\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") "
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.493886 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-utilities\") pod \"e5ad925e-671c-430a-b4fd-5c144fe69531\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") "
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.493985 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-catalog-content\") pod \"e5ad925e-671c-430a-b4fd-5c144fe69531\" (UID: \"e5ad925e-671c-430a-b4fd-5c144fe69531\") "
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.496315 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-utilities" (OuterVolumeSpecName: "utilities") pod "e5ad925e-671c-430a-b4fd-5c144fe69531" (UID: "e5ad925e-671c-430a-b4fd-5c144fe69531"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.501476 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ad925e-671c-430a-b4fd-5c144fe69531-kube-api-access-g62k8" (OuterVolumeSpecName: "kube-api-access-g62k8") pod "e5ad925e-671c-430a-b4fd-5c144fe69531" (UID: "e5ad925e-671c-430a-b4fd-5c144fe69531"). InnerVolumeSpecName "kube-api-access-g62k8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.546855 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5ad925e-671c-430a-b4fd-5c144fe69531" (UID: "e5ad925e-671c-430a-b4fd-5c144fe69531"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.598186 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g62k8\" (UniqueName: \"kubernetes.io/projected/e5ad925e-671c-430a-b4fd-5c144fe69531-kube-api-access-g62k8\") on node \"crc\" DevicePath \"\""
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.598232 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 00:33:53 crc kubenswrapper[4911]: I1201 00:33:53.598248 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ad925e-671c-430a-b4fd-5c144fe69531-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.036727 4911 generic.go:334] "Generic (PLEG): container finished" podID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerID="8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8" exitCode=0
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.036803 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhp5n" event={"ID":"e5ad925e-671c-430a-b4fd-5c144fe69531","Type":"ContainerDied","Data":"8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8"}
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.036844 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vhp5n"
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.037302 4911 scope.go:117] "RemoveContainer" containerID="8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8"
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.037879 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhp5n" event={"ID":"e5ad925e-671c-430a-b4fd-5c144fe69531","Type":"ContainerDied","Data":"d75b264c425c8d3cc344f29b162b4bfefacb681b148be02661741fd8cfe195c6"}
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.060352 4911 scope.go:117] "RemoveContainer" containerID="b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976"
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.089053 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vhp5n"]
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.097668 4911 scope.go:117] "RemoveContainer" containerID="2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8"
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.097722 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vhp5n"]
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.112442 4911 scope.go:117] "RemoveContainer" containerID="8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8"
Dec 01 00:33:54 crc kubenswrapper[4911]: E1201 00:33:54.113013 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8\": container with ID starting with 8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8 not found: ID does not exist" containerID="8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8"
Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.113044
4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8"} err="failed to get container status \"8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8\": rpc error: code = NotFound desc = could not find container \"8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8\": container with ID starting with 8df4d688dec69683be4d6d7936156959f2e0517317331982ee73ec0735ff27d8 not found: ID does not exist" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.113067 4911 scope.go:117] "RemoveContainer" containerID="b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976" Dec 01 00:33:54 crc kubenswrapper[4911]: E1201 00:33:54.113551 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976\": container with ID starting with b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976 not found: ID does not exist" containerID="b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.113580 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976"} err="failed to get container status \"b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976\": rpc error: code = NotFound desc = could not find container \"b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976\": container with ID starting with b0611326323570c44b8296aa14bb3f4e3324d46e77cb54d3b69f6e396723e976 not found: ID does not exist" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.113594 4911 scope.go:117] "RemoveContainer" containerID="2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8" Dec 01 00:33:54 crc kubenswrapper[4911]: E1201 
00:33:54.114499 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8\": container with ID starting with 2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8 not found: ID does not exist" containerID="2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.114528 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8"} err="failed to get container status \"2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8\": rpc error: code = NotFound desc = could not find container \"2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8\": container with ID starting with 2f6ab694852ebdd03f42fa986546d69f12571a8e700d8933a0650636588d91a8 not found: ID does not exist" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.164164 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" path="/var/lib/kubelet/pods/e5ad925e-671c-430a-b4fd-5c144fe69531/volumes" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.334237 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510045 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-pull\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510143 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-system-configs\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510189 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-run\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510214 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-push\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510246 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-node-pullsecrets\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510284 4911 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildworkdir\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510431 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510520 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-root\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510543 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-proxy-ca-bundles\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510566 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-ca-bundles\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.510643 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnj2s\" 
(UniqueName: \"kubernetes.io/projected/ac356dd7-4910-48aa-9044-7a1bd0d59785-kube-api-access-vnj2s\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.511365 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-blob-cache\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.511748 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.511790 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.511818 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.513179 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.513562 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildcachedir\") pod \"ac356dd7-4910-48aa-9044-7a1bd0d59785\" (UID: \"ac356dd7-4910-48aa-9044-7a1bd0d59785\") " Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.513640 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.513917 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.514385 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.514410 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.514425 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.514440 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.514482 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.514500 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.514511 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac356dd7-4910-48aa-9044-7a1bd0d59785-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc 
kubenswrapper[4911]: I1201 00:33:54.516539 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.516956 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac356dd7-4910-48aa-9044-7a1bd0d59785-kube-api-access-vnj2s" (OuterVolumeSpecName: "kube-api-access-vnj2s") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "kube-api-access-vnj2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.518081 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.616179 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnj2s\" (UniqueName: \"kubernetes.io/projected/ac356dd7-4910-48aa-9044-7a1bd0d59785-kube-api-access-vnj2s\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.616634 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.616667 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac356dd7-4910-48aa-9044-7a1bd0d59785-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.769669 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:33:54 crc kubenswrapper[4911]: I1201 00:33:54.819664 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:33:55 crc kubenswrapper[4911]: I1201 00:33:55.050843 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"ac356dd7-4910-48aa-9044-7a1bd0d59785","Type":"ContainerDied","Data":"edd66827aa1fca9bb88e4c1267319b3d6c9b3448c0f66223824e10b3d2316276"} Dec 01 00:33:55 crc kubenswrapper[4911]: I1201 00:33:55.050918 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edd66827aa1fca9bb88e4c1267319b3d6c9b3448c0f66223824e10b3d2316276" Dec 01 00:33:55 crc kubenswrapper[4911]: I1201 00:33:55.051040 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Dec 01 00:33:55 crc kubenswrapper[4911]: I1201 00:33:55.422739 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ac356dd7-4910-48aa-9044-7a1bd0d59785" (UID: "ac356dd7-4910-48aa-9044-7a1bd0d59785"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:33:55 crc kubenswrapper[4911]: I1201 00:33:55.431756 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac356dd7-4910-48aa-9044-7a1bd0d59785-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.889442 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Dec 01 00:34:03 crc kubenswrapper[4911]: E1201 00:34:03.890727 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerName="registry-server" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.890749 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerName="registry-server" Dec 01 00:34:03 crc kubenswrapper[4911]: E1201 00:34:03.890765 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerName="extract-utilities" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.890774 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerName="extract-utilities" Dec 01 00:34:03 crc kubenswrapper[4911]: E1201 00:34:03.890807 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerName="manage-dockerfile" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.890818 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerName="manage-dockerfile" Dec 01 00:34:03 crc kubenswrapper[4911]: E1201 00:34:03.890836 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerName="docker-build" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.890846 4911 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerName="docker-build" Dec 01 00:34:03 crc kubenswrapper[4911]: E1201 00:34:03.890861 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerName="git-clone" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.890869 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerName="git-clone" Dec 01 00:34:03 crc kubenswrapper[4911]: E1201 00:34:03.890880 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerName="extract-content" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.890887 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerName="extract-content" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.891052 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ad925e-671c-430a-b4fd-5c144fe69531" containerName="registry-server" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.891068 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac356dd7-4910-48aa-9044-7a1bd0d59785" containerName="docker-build" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.892268 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.894974 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.895109 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.895154 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.895211 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.908063 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.943648 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.943708 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:03 crc kubenswrapper[4911]: 
I1201 00:34:03.943739 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.943760 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:03 crc kubenswrapper[4911]: I1201 00:34:03.943953 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.044907 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.044988 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045013 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045031 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045057 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045080 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045102 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045118 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045142 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045205 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045508 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p7fwf\" (UniqueName: \"kubernetes.io/projected/21c11afc-2069-4b2b-b620-98214dde9cec-kube-api-access-p7fwf\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045699 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045741 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045839 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.045942 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 
01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.051247 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.051253 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.146979 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.147572 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.147629 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.147681 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.147733 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.147767 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.147876 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 
00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.147946 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7fwf\" (UniqueName: \"kubernetes.io/projected/21c11afc-2069-4b2b-b620-98214dde9cec-kube-api-access-p7fwf\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.148133 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.148215 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.148238 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.148642 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-run\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.148968 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.166970 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7fwf\" (UniqueName: \"kubernetes.io/projected/21c11afc-2069-4b2b-b620-98214dde9cec-kube-api-access-p7fwf\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.219824 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:04 crc kubenswrapper[4911]: I1201 00:34:04.402613 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Dec 01 00:34:05 crc kubenswrapper[4911]: I1201 00:34:05.134265 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"21c11afc-2069-4b2b-b620-98214dde9cec","Type":"ContainerStarted","Data":"a5c45ac710eb6fc2c535d03fba3059dd30b777622fcae3b9c340d60d4e768bc9"} Dec 01 00:34:06 crc kubenswrapper[4911]: I1201 00:34:06.153582 4911 generic.go:334] "Generic (PLEG): container finished" podID="21c11afc-2069-4b2b-b620-98214dde9cec" containerID="f006f4209131e0c4eaf3adce194c71a64279499ef4eb11160ecdfa89147af76d" exitCode=0 Dec 01 00:34:06 crc kubenswrapper[4911]: I1201 00:34:06.159536 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"21c11afc-2069-4b2b-b620-98214dde9cec","Type":"ContainerDied","Data":"f006f4209131e0c4eaf3adce194c71a64279499ef4eb11160ecdfa89147af76d"} Dec 01 00:34:07 crc kubenswrapper[4911]: I1201 00:34:07.164195 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_21c11afc-2069-4b2b-b620-98214dde9cec/docker-build/0.log" Dec 01 00:34:07 crc kubenswrapper[4911]: I1201 00:34:07.165855 4911 generic.go:334] "Generic (PLEG): container finished" podID="21c11afc-2069-4b2b-b620-98214dde9cec" containerID="d8032d4133d583694274d940c88b8d4fc82b447d0f789cedb3ba9c845a865519" exitCode=1 Dec 01 00:34:07 crc kubenswrapper[4911]: I1201 00:34:07.166059 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" 
event={"ID":"21c11afc-2069-4b2b-b620-98214dde9cec","Type":"ContainerDied","Data":"d8032d4133d583694274d940c88b8d4fc82b447d0f789cedb3ba9c845a865519"} Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.410947 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_21c11afc-2069-4b2b-b620-98214dde9cec/docker-build/0.log" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.411393 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.496843 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-run\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.496937 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-root\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.496971 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-push\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.497961 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod 
"21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.499282 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.504260 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.597605 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-pull\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598044 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-buildcachedir\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598075 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-ca-bundles\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598172 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-build-blob-cache\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598205 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-proxy-ca-bundles\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598240 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-p7fwf\" (UniqueName: \"kubernetes.io/projected/21c11afc-2069-4b2b-b620-98214dde9cec-kube-api-access-p7fwf\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598275 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-buildworkdir\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598303 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-node-pullsecrets\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598322 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-system-configs\") pod \"21c11afc-2069-4b2b-b620-98214dde9cec\" (UID: \"21c11afc-2069-4b2b-b620-98214dde9cec\") " Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598715 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598750 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.598769 4911 reconciler_common.go:293] "Volume 
detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.599069 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.599122 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.599662 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.600811 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.601117 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.601802 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.602769 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.603537 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.605095 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c11afc-2069-4b2b-b620-98214dde9cec-kube-api-access-p7fwf" (OuterVolumeSpecName: "kube-api-access-p7fwf") pod "21c11afc-2069-4b2b-b620-98214dde9cec" (UID: "21c11afc-2069-4b2b-b620-98214dde9cec"). InnerVolumeSpecName "kube-api-access-p7fwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699879 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699916 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699928 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699941 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699954 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7fwf\" (UniqueName: \"kubernetes.io/projected/21c11afc-2069-4b2b-b620-98214dde9cec-kube-api-access-p7fwf\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699966 4911 reconciler_common.go:293] "Volume 
detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/21c11afc-2069-4b2b-b620-98214dde9cec-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699976 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21c11afc-2069-4b2b-b620-98214dde9cec-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699987 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/21c11afc-2069-4b2b-b620-98214dde9cec-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:08 crc kubenswrapper[4911]: I1201 00:34:08.699999 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/21c11afc-2069-4b2b-b620-98214dde9cec-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:09 crc kubenswrapper[4911]: I1201 00:34:09.181695 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_21c11afc-2069-4b2b-b620-98214dde9cec/docker-build/0.log" Dec 01 00:34:09 crc kubenswrapper[4911]: I1201 00:34:09.182432 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"21c11afc-2069-4b2b-b620-98214dde9cec","Type":"ContainerDied","Data":"a5c45ac710eb6fc2c535d03fba3059dd30b777622fcae3b9c340d60d4e768bc9"} Dec 01 00:34:09 crc kubenswrapper[4911]: I1201 00:34:09.182495 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c45ac710eb6fc2c535d03fba3059dd30b777622fcae3b9c340d60d4e768bc9" Dec 01 00:34:09 crc kubenswrapper[4911]: I1201 00:34:09.182498 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Dec 01 00:34:14 crc kubenswrapper[4911]: I1201 00:34:14.395413 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Dec 01 00:34:14 crc kubenswrapper[4911]: I1201 00:34:14.401089 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.008074 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Dec 01 00:34:16 crc kubenswrapper[4911]: E1201 00:34:16.008742 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c11afc-2069-4b2b-b620-98214dde9cec" containerName="docker-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.008759 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c11afc-2069-4b2b-b620-98214dde9cec" containerName="docker-build" Dec 01 00:34:16 crc kubenswrapper[4911]: E1201 00:34:16.008789 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c11afc-2069-4b2b-b620-98214dde9cec" containerName="manage-dockerfile" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.008798 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c11afc-2069-4b2b-b620-98214dde9cec" containerName="manage-dockerfile" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.008919 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c11afc-2069-4b2b-b620-98214dde9cec" containerName="docker-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.010237 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.012184 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.012294 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.012719 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.014083 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.038032 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.065875 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.065915 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 
00:34:16.065936 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066007 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066045 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066094 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066115 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: 
\"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066161 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066215 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdtqb\" (UniqueName: \"kubernetes.io/projected/ac34576e-455c-46af-925e-379eee643edf-kube-api-access-qdtqb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066240 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066263 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.066293 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.164759 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c11afc-2069-4b2b-b620-98214dde9cec" path="/var/lib/kubelet/pods/21c11afc-2069-4b2b-b620-98214dde9cec/volumes" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.167350 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.167413 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.167632 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168028 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168159 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168292 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdtqb\" (UniqueName: \"kubernetes.io/projected/ac34576e-455c-46af-925e-379eee643edf-kube-api-access-qdtqb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168350 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168390 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168655 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168753 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168789 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168828 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168858 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.168964 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.169238 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.169295 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.169379 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.169441 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.169939 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.170961 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.171351 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 
00:34:16.174059 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.174058 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.189114 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdtqb\" (UniqueName: \"kubernetes.io/projected/ac34576e-455c-46af-925e-379eee643edf-kube-api-access-qdtqb\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.345531 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:16 crc kubenswrapper[4911]: I1201 00:34:16.576414 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Dec 01 00:34:17 crc kubenswrapper[4911]: I1201 00:34:17.253085 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"ac34576e-455c-46af-925e-379eee643edf","Type":"ContainerStarted","Data":"a5b12b34d1f9ce952de7e2ff82eb244c41cf10c9298947a8d6b31b4e37bc141d"} Dec 01 00:34:17 crc kubenswrapper[4911]: I1201 00:34:17.253359 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"ac34576e-455c-46af-925e-379eee643edf","Type":"ContainerStarted","Data":"7b64d80530747a53810edd3f14fb9d0722aac716f237194ad1940b840311a7a1"} Dec 01 00:34:18 crc kubenswrapper[4911]: I1201 00:34:18.262821 4911 generic.go:334] "Generic (PLEG): container finished" podID="ac34576e-455c-46af-925e-379eee643edf" containerID="a5b12b34d1f9ce952de7e2ff82eb244c41cf10c9298947a8d6b31b4e37bc141d" exitCode=0 Dec 01 00:34:18 crc kubenswrapper[4911]: I1201 00:34:18.262879 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"ac34576e-455c-46af-925e-379eee643edf","Type":"ContainerDied","Data":"a5b12b34d1f9ce952de7e2ff82eb244c41cf10c9298947a8d6b31b4e37bc141d"} Dec 01 00:34:19 crc kubenswrapper[4911]: I1201 00:34:19.273520 4911 generic.go:334] "Generic (PLEG): container finished" podID="ac34576e-455c-46af-925e-379eee643edf" containerID="c982ffb2e5db34904e1f0ef5efb96c6715d036fa8827b459b5b47581acbd032f" exitCode=0 Dec 01 00:34:19 crc kubenswrapper[4911]: I1201 00:34:19.273848 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"ac34576e-455c-46af-925e-379eee643edf","Type":"ContainerDied","Data":"c982ffb2e5db34904e1f0ef5efb96c6715d036fa8827b459b5b47581acbd032f"} Dec 01 00:34:19 crc kubenswrapper[4911]: I1201 00:34:19.332820 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_ac34576e-455c-46af-925e-379eee643edf/manage-dockerfile/0.log" Dec 01 00:34:20 crc kubenswrapper[4911]: I1201 00:34:20.287773 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"ac34576e-455c-46af-925e-379eee643edf","Type":"ContainerStarted","Data":"b33cffd82580366ae5c99ed5114c6cd2b885c625e385119a93479767df3697af"} Dec 01 00:34:20 crc kubenswrapper[4911]: I1201 00:34:20.320015 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=5.319992631 podStartE2EDuration="5.319992631s" podCreationTimestamp="2025-12-01 00:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:34:20.318833019 +0000 UTC m=+1620.457529800" watchObservedRunningTime="2025-12-01 00:34:20.319992631 +0000 UTC m=+1620.458689412" Dec 01 00:34:21 crc kubenswrapper[4911]: I1201 00:34:21.312080 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:34:21 crc kubenswrapper[4911]: I1201 00:34:21.312143 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:34:21 crc kubenswrapper[4911]: I1201 00:34:21.312189 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:34:21 crc kubenswrapper[4911]: I1201 00:34:21.312999 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:34:21 crc kubenswrapper[4911]: I1201 00:34:21.313084 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" gracePeriod=600 Dec 01 00:34:21 crc kubenswrapper[4911]: E1201 00:34:21.465950 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:34:22 crc kubenswrapper[4911]: I1201 00:34:22.305259 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" exitCode=0 Dec 01 00:34:22 crc kubenswrapper[4911]: I1201 00:34:22.305328 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"} Dec 01 00:34:22 crc kubenswrapper[4911]: I1201 00:34:22.305664 4911 scope.go:117] "RemoveContainer" containerID="ee4df74b53d9f3dd3ba8fcd693c318ddcec3ad0d37918e7de6e25ea516415ccd" Dec 01 00:34:22 crc kubenswrapper[4911]: I1201 00:34:22.306157 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:34:22 crc kubenswrapper[4911]: E1201 00:34:22.306395 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:34:23 crc kubenswrapper[4911]: I1201 00:34:23.315120 4911 generic.go:334] "Generic (PLEG): container finished" podID="ac34576e-455c-46af-925e-379eee643edf" containerID="b33cffd82580366ae5c99ed5114c6cd2b885c625e385119a93479767df3697af" exitCode=0 Dec 01 00:34:23 crc kubenswrapper[4911]: I1201 00:34:23.315199 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"ac34576e-455c-46af-925e-379eee643edf","Type":"ContainerDied","Data":"b33cffd82580366ae5c99ed5114c6cd2b885c625e385119a93479767df3697af"} Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.577204 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627259 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-buildworkdir\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627421 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-buildcachedir\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627482 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-root\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627529 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-proxy-ca-bundles\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627590 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-run\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627623 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-build-blob-cache\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627645 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-node-pullsecrets\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627686 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-push\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627712 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-system-configs\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627811 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-ca-bundles\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627841 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdtqb\" (UniqueName: \"kubernetes.io/projected/ac34576e-455c-46af-925e-379eee643edf-kube-api-access-qdtqb\") 
pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.627872 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-pull\") pod \"ac34576e-455c-46af-925e-379eee643edf\" (UID: \"ac34576e-455c-46af-925e-379eee643edf\") " Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.632166 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.632699 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.632729 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.634041 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.634142 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.635282 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.636160 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.638414 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.647379 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.655731 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.655733 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac34576e-455c-46af-925e-379eee643edf-kube-api-access-qdtqb" (OuterVolumeSpecName: "kube-api-access-qdtqb") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "kube-api-access-qdtqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.655841 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "ac34576e-455c-46af-925e-379eee643edf" (UID: "ac34576e-455c-46af-925e-379eee643edf"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.759987 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760046 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdtqb\" (UniqueName: \"kubernetes.io/projected/ac34576e-455c-46af-925e-379eee643edf-kube-api-access-qdtqb\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760063 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760078 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760091 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760101 4911 
reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760111 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760121 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760131 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac34576e-455c-46af-925e-379eee643edf-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760140 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac34576e-455c-46af-925e-379eee643edf-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760151 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/ac34576e-455c-46af-925e-379eee643edf-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:24 crc kubenswrapper[4911]: I1201 00:34:24.760161 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ac34576e-455c-46af-925e-379eee643edf-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:25 crc kubenswrapper[4911]: I1201 00:34:25.340651 4911 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"ac34576e-455c-46af-925e-379eee643edf","Type":"ContainerDied","Data":"7b64d80530747a53810edd3f14fb9d0722aac716f237194ad1940b840311a7a1"} Dec 01 00:34:25 crc kubenswrapper[4911]: I1201 00:34:25.340708 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b64d80530747a53810edd3f14fb9d0722aac716f237194ad1940b840311a7a1" Dec 01 00:34:25 crc kubenswrapper[4911]: I1201 00:34:25.340835 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.508406 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Dec 01 00:34:29 crc kubenswrapper[4911]: E1201 00:34:29.509409 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac34576e-455c-46af-925e-379eee643edf" containerName="git-clone" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.509429 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac34576e-455c-46af-925e-379eee643edf" containerName="git-clone" Dec 01 00:34:29 crc kubenswrapper[4911]: E1201 00:34:29.509455 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac34576e-455c-46af-925e-379eee643edf" containerName="manage-dockerfile" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.509526 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac34576e-455c-46af-925e-379eee643edf" containerName="manage-dockerfile" Dec 01 00:34:29 crc kubenswrapper[4911]: E1201 00:34:29.509542 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac34576e-455c-46af-925e-379eee643edf" containerName="docker-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.509553 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac34576e-455c-46af-925e-379eee643edf" 
containerName="docker-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.509726 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac34576e-455c-46af-925e-379eee643edf" containerName="docker-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.510665 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.512669 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.512883 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.512679 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.514426 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.545159 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626567 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626624 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626651 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626702 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626727 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626756 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626815 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626853 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.626949 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.627001 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndpk8\" (UniqueName: \"kubernetes.io/projected/f3df01f1-f02c-45da-b645-dd04a702516a-kube-api-access-ndpk8\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.627037 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.627126 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.729110 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.729172 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndpk8\" (UniqueName: \"kubernetes.io/projected/f3df01f1-f02c-45da-b645-dd04a702516a-kube-api-access-ndpk8\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.729417 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.729592 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.729679 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.730051 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.730192 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.730366 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.730742 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.732562 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.730838 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.733204 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.733311 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.734544 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.734610 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.734655 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.734691 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.734730 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.734807 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.735290 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.735355 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.742131 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: 
\"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.742182 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.746837 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndpk8\" (UniqueName: \"kubernetes.io/projected/f3df01f1-f02c-45da-b645-dd04a702516a-kube-api-access-ndpk8\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:29 crc kubenswrapper[4911]: I1201 00:34:29.826771 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:30 crc kubenswrapper[4911]: I1201 00:34:30.080093 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Dec 01 00:34:30 crc kubenswrapper[4911]: I1201 00:34:30.381170 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"f3df01f1-f02c-45da-b645-dd04a702516a","Type":"ContainerStarted","Data":"7ac1d8ba38f4f724dc6f314b2509a8f178d12f93538cc091da7cb88c10845f41"} Dec 01 00:34:30 crc kubenswrapper[4911]: I1201 00:34:30.381223 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"f3df01f1-f02c-45da-b645-dd04a702516a","Type":"ContainerStarted","Data":"8d207c7f6ec32415785c7467b1070293b6ae27f37d30017dc2130f09c29e0064"} Dec 01 00:34:31 crc kubenswrapper[4911]: I1201 00:34:31.390431 4911 generic.go:334] "Generic (PLEG): container finished" podID="f3df01f1-f02c-45da-b645-dd04a702516a" containerID="7ac1d8ba38f4f724dc6f314b2509a8f178d12f93538cc091da7cb88c10845f41" exitCode=0 Dec 01 00:34:31 crc kubenswrapper[4911]: I1201 00:34:31.390513 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"f3df01f1-f02c-45da-b645-dd04a702516a","Type":"ContainerDied","Data":"7ac1d8ba38f4f724dc6f314b2509a8f178d12f93538cc091da7cb88c10845f41"} Dec 01 00:34:32 crc kubenswrapper[4911]: I1201 00:34:32.401993 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_f3df01f1-f02c-45da-b645-dd04a702516a/docker-build/0.log" Dec 01 00:34:32 crc kubenswrapper[4911]: I1201 00:34:32.402768 4911 generic.go:334] "Generic (PLEG): container finished" podID="f3df01f1-f02c-45da-b645-dd04a702516a" 
containerID="b1a7af46c8c53178acabb269bd37c81e0ab7744a6405b22a0f744963654d844d" exitCode=1 Dec 01 00:34:32 crc kubenswrapper[4911]: I1201 00:34:32.402807 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"f3df01f1-f02c-45da-b645-dd04a702516a","Type":"ContainerDied","Data":"b1a7af46c8c53178acabb269bd37c81e0ab7744a6405b22a0f744963654d844d"} Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.715553 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_f3df01f1-f02c-45da-b645-dd04a702516a/docker-build/0.log" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.716259 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886563 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-system-configs\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886655 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-proxy-ca-bundles\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886683 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-pull\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 
crc kubenswrapper[4911]: I1201 00:34:33.886725 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-ca-bundles\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886745 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-push\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886765 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-root\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886803 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-build-blob-cache\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886818 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-run\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886836 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-node-pullsecrets\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886875 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-buildcachedir\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886901 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndpk8\" (UniqueName: \"kubernetes.io/projected/f3df01f1-f02c-45da-b645-dd04a702516a-kube-api-access-ndpk8\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.886937 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-buildworkdir\") pod \"f3df01f1-f02c-45da-b645-dd04a702516a\" (UID: \"f3df01f1-f02c-45da-b645-dd04a702516a\") " Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.887578 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.887611 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.887627 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.888402 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.888662 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.888710 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.888909 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.889349 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.889972 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.893565 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.893894 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3df01f1-f02c-45da-b645-dd04a702516a-kube-api-access-ndpk8" (OuterVolumeSpecName: "kube-api-access-ndpk8") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "kube-api-access-ndpk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.895924 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "f3df01f1-f02c-45da-b645-dd04a702516a" (UID: "f3df01f1-f02c-45da-b645-dd04a702516a"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988598 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988649 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988661 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/f3df01f1-f02c-45da-b645-dd04a702516a-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988675 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988687 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988698 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988710 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-container-storage-run\") 
on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988721 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988729 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f3df01f1-f02c-45da-b645-dd04a702516a-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988736 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndpk8\" (UniqueName: \"kubernetes.io/projected/f3df01f1-f02c-45da-b645-dd04a702516a-kube-api-access-ndpk8\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988744 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f3df01f1-f02c-45da-b645-dd04a702516a-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:33 crc kubenswrapper[4911]: I1201 00:34:33.988753 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f3df01f1-f02c-45da-b645-dd04a702516a-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:34 crc kubenswrapper[4911]: I1201 00:34:34.420838 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_f3df01f1-f02c-45da-b645-dd04a702516a/docker-build/0.log" Dec 01 00:34:34 crc kubenswrapper[4911]: I1201 00:34:34.421416 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"f3df01f1-f02c-45da-b645-dd04a702516a","Type":"ContainerDied","Data":"8d207c7f6ec32415785c7467b1070293b6ae27f37d30017dc2130f09c29e0064"} Dec 01 00:34:34 crc 
kubenswrapper[4911]: I1201 00:34:34.421498 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d207c7f6ec32415785c7467b1070293b6ae27f37d30017dc2130f09c29e0064" Dec 01 00:34:34 crc kubenswrapper[4911]: I1201 00:34:34.421519 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Dec 01 00:34:37 crc kubenswrapper[4911]: I1201 00:34:37.151796 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:34:37 crc kubenswrapper[4911]: E1201 00:34:37.152700 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:34:40 crc kubenswrapper[4911]: I1201 00:34:40.400206 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Dec 01 00:34:40 crc kubenswrapper[4911]: I1201 00:34:40.404925 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.521079 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hlp8d"] Dec 01 00:34:41 crc kubenswrapper[4911]: E1201 00:34:41.521967 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3df01f1-f02c-45da-b645-dd04a702516a" containerName="manage-dockerfile" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.521986 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3df01f1-f02c-45da-b645-dd04a702516a" 
containerName="manage-dockerfile" Dec 01 00:34:41 crc kubenswrapper[4911]: E1201 00:34:41.522014 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3df01f1-f02c-45da-b645-dd04a702516a" containerName="docker-build" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.522023 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3df01f1-f02c-45da-b645-dd04a702516a" containerName="docker-build" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.522219 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3df01f1-f02c-45da-b645-dd04a702516a" containerName="docker-build" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.523395 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.541237 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlp8d"] Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.597905 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-catalog-content\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.598042 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-utilities\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.598144 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-752lk\" 
(UniqueName: \"kubernetes.io/projected/05a322b2-5a1b-4cdd-879d-5f55a25baddf-kube-api-access-752lk\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.698966 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-752lk\" (UniqueName: \"kubernetes.io/projected/05a322b2-5a1b-4cdd-879d-5f55a25baddf-kube-api-access-752lk\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.699041 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-catalog-content\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.699085 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-utilities\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.699776 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-catalog-content\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.699819 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-utilities\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.722727 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-752lk\" (UniqueName: \"kubernetes.io/projected/05a322b2-5a1b-4cdd-879d-5f55a25baddf-kube-api-access-752lk\") pod \"community-operators-hlp8d\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:41 crc kubenswrapper[4911]: I1201 00:34:41.846554 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.032295 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.033979 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.038185 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.038427 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.038814 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.038973 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.069288 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.139544 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlp8d"] Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.159948 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3df01f1-f02c-45da-b645-dd04a702516a" path="/var/lib/kubelet/pods/f3df01f1-f02c-45da-b645-dd04a702516a/volumes" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.208965 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209011 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209035 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209055 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209147 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209191 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" 
(UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209221 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dvbs\" (UniqueName: \"kubernetes.io/projected/e5712053-331b-4321-a2ea-e71e1fafcd03-kube-api-access-8dvbs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209250 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209270 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209293 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209364 4911 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.209387 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310020 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310070 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310095 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-proxy-ca-bundles\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310139 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310167 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310203 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310227 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310251 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310277 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310329 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310356 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310384 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dvbs\" (UniqueName: \"kubernetes.io/projected/e5712053-331b-4321-a2ea-e71e1fafcd03-kube-api-access-8dvbs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc 
kubenswrapper[4911]: I1201 00:34:42.310727 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310777 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310899 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.310673 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.311049 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.311079 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.311541 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.311747 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.311934 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.318307 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-push\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.319755 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.355280 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dvbs\" (UniqueName: \"kubernetes.io/projected/e5712053-331b-4321-a2ea-e71e1fafcd03-kube-api-access-8dvbs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.491624 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlp8d" event={"ID":"05a322b2-5a1b-4cdd-879d-5f55a25baddf","Type":"ContainerStarted","Data":"3f8505870683c9c12f3e45eb4b0bd2e5b9a7c79e1df71f643abea8a7529019a1"} Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.647774 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:42 crc kubenswrapper[4911]: I1201 00:34:42.878090 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Dec 01 00:34:43 crc kubenswrapper[4911]: I1201 00:34:43.501765 4911 generic.go:334] "Generic (PLEG): container finished" podID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerID="9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf" exitCode=0 Dec 01 00:34:43 crc kubenswrapper[4911]: I1201 00:34:43.502600 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlp8d" event={"ID":"05a322b2-5a1b-4cdd-879d-5f55a25baddf","Type":"ContainerDied","Data":"9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf"} Dec 01 00:34:43 crc kubenswrapper[4911]: I1201 00:34:43.505226 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"e5712053-331b-4321-a2ea-e71e1fafcd03","Type":"ContainerStarted","Data":"3e4654f2af26151a0c24ad737e0f75b24ffba9df38e355e165902816ade7dd04"} Dec 01 00:34:43 crc kubenswrapper[4911]: I1201 00:34:43.505278 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"e5712053-331b-4321-a2ea-e71e1fafcd03","Type":"ContainerStarted","Data":"a1a7e3ec4efab6fb9f9fae3952f6bf5f88cfdc858c5e28493807deefcc1ba058"} Dec 01 00:34:44 crc kubenswrapper[4911]: I1201 00:34:44.521929 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlp8d" event={"ID":"05a322b2-5a1b-4cdd-879d-5f55a25baddf","Type":"ContainerStarted","Data":"a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac"} Dec 01 00:34:45 crc kubenswrapper[4911]: I1201 00:34:45.530950 4911 generic.go:334] "Generic (PLEG): container finished" podID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" 
containerID="a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac" exitCode=0 Dec 01 00:34:45 crc kubenswrapper[4911]: I1201 00:34:45.531176 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlp8d" event={"ID":"05a322b2-5a1b-4cdd-879d-5f55a25baddf","Type":"ContainerDied","Data":"a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac"} Dec 01 00:34:45 crc kubenswrapper[4911]: I1201 00:34:45.533734 4911 generic.go:334] "Generic (PLEG): container finished" podID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerID="3e4654f2af26151a0c24ad737e0f75b24ffba9df38e355e165902816ade7dd04" exitCode=0 Dec 01 00:34:45 crc kubenswrapper[4911]: I1201 00:34:45.533834 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"e5712053-331b-4321-a2ea-e71e1fafcd03","Type":"ContainerDied","Data":"3e4654f2af26151a0c24ad737e0f75b24ffba9df38e355e165902816ade7dd04"} Dec 01 00:34:46 crc kubenswrapper[4911]: I1201 00:34:46.552781 4911 generic.go:334] "Generic (PLEG): container finished" podID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerID="866e73e0476e37776864ced6911aaa8980f265b649b3c51854084748bff9f3f3" exitCode=0 Dec 01 00:34:46 crc kubenswrapper[4911]: I1201 00:34:46.552859 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"e5712053-331b-4321-a2ea-e71e1fafcd03","Type":"ContainerDied","Data":"866e73e0476e37776864ced6911aaa8980f265b649b3c51854084748bff9f3f3"} Dec 01 00:34:46 crc kubenswrapper[4911]: I1201 00:34:46.558007 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlp8d" event={"ID":"05a322b2-5a1b-4cdd-879d-5f55a25baddf","Type":"ContainerStarted","Data":"361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3"} Dec 01 00:34:46 crc kubenswrapper[4911]: I1201 00:34:46.616406 4911 log.go:25] "Finished 
parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_e5712053-331b-4321-a2ea-e71e1fafcd03/manage-dockerfile/0.log" Dec 01 00:34:46 crc kubenswrapper[4911]: I1201 00:34:46.632216 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hlp8d" podStartSLOduration=3.161014464 podStartE2EDuration="5.632200656s" podCreationTimestamp="2025-12-01 00:34:41 +0000 UTC" firstStartedPulling="2025-12-01 00:34:43.504060797 +0000 UTC m=+1643.642757608" lastFinishedPulling="2025-12-01 00:34:45.975247019 +0000 UTC m=+1646.113943800" observedRunningTime="2025-12-01 00:34:46.626217738 +0000 UTC m=+1646.764914509" watchObservedRunningTime="2025-12-01 00:34:46.632200656 +0000 UTC m=+1646.770897427" Dec 01 00:34:47 crc kubenswrapper[4911]: I1201 00:34:47.566942 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"e5712053-331b-4321-a2ea-e71e1fafcd03","Type":"ContainerStarted","Data":"0a13b14b4bc2bf6ef692524cc82840d1a66892003f5c8320827085620358c4ec"} Dec 01 00:34:47 crc kubenswrapper[4911]: I1201 00:34:47.598193 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.59817191 podStartE2EDuration="5.59817191s" podCreationTimestamp="2025-12-01 00:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:34:47.594948419 +0000 UTC m=+1647.733645190" watchObservedRunningTime="2025-12-01 00:34:47.59817191 +0000 UTC m=+1647.736868671" Dec 01 00:34:51 crc kubenswrapper[4911]: I1201 00:34:51.151808 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:34:51 crc kubenswrapper[4911]: E1201 00:34:51.153078 4911 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:34:51 crc kubenswrapper[4911]: I1201 00:34:51.599798 4911 generic.go:334] "Generic (PLEG): container finished" podID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerID="0a13b14b4bc2bf6ef692524cc82840d1a66892003f5c8320827085620358c4ec" exitCode=0 Dec 01 00:34:51 crc kubenswrapper[4911]: I1201 00:34:51.599869 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"e5712053-331b-4321-a2ea-e71e1fafcd03","Type":"ContainerDied","Data":"0a13b14b4bc2bf6ef692524cc82840d1a66892003f5c8320827085620358c4ec"} Dec 01 00:34:51 crc kubenswrapper[4911]: I1201 00:34:51.847311 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:51 crc kubenswrapper[4911]: I1201 00:34:51.847418 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:51 crc kubenswrapper[4911]: I1201 00:34:51.904430 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:52 crc kubenswrapper[4911]: I1201 00:34:52.678986 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:52 crc kubenswrapper[4911]: I1201 00:34:52.738500 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlp8d"] Dec 01 00:34:52 crc kubenswrapper[4911]: I1201 00:34:52.925969 4911 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.075494 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-run\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.075784 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dvbs\" (UniqueName: \"kubernetes.io/projected/e5712053-331b-4321-a2ea-e71e1fafcd03-kube-api-access-8dvbs\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.075925 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-proxy-ca-bundles\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.076514 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-node-pullsecrets\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.076656 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-push\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc 
kubenswrapper[4911]: I1201 00:34:53.076755 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-ca-bundles\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.076867 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-pull\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.076447 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.076612 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.076655 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). 
InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.077272 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-build-blob-cache\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.078600 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-buildworkdir\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.078676 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-root\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.078709 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-buildcachedir\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.078747 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-system-configs\") pod \"e5712053-331b-4321-a2ea-e71e1fafcd03\" (UID: \"e5712053-331b-4321-a2ea-e71e1fafcd03\") " Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.077413 4911 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.078863 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.079039 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.079157 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.079270 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.079343 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e5712053-331b-4321-a2ea-e71e1fafcd03-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.079400 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.079488 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.079287 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.080651 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.082189 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.091115 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.093279 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5712053-331b-4321-a2ea-e71e1fafcd03-kube-api-access-8dvbs" (OuterVolumeSpecName: "kube-api-access-8dvbs") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "kube-api-access-8dvbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.094080 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "e5712053-331b-4321-a2ea-e71e1fafcd03" (UID: "e5712053-331b-4321-a2ea-e71e1fafcd03"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.180138 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dvbs\" (UniqueName: \"kubernetes.io/projected/e5712053-331b-4321-a2ea-e71e1fafcd03-kube-api-access-8dvbs\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.180882 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.181014 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/e5712053-331b-4321-a2ea-e71e1fafcd03-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.181164 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.181326 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 
00:34:53.181517 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e5712053-331b-4321-a2ea-e71e1fafcd03-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.181656 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e5712053-331b-4321-a2ea-e71e1fafcd03-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.616406 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"e5712053-331b-4321-a2ea-e71e1fafcd03","Type":"ContainerDied","Data":"a1a7e3ec4efab6fb9f9fae3952f6bf5f88cfdc858c5e28493807deefcc1ba058"} Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.616416 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Dec 01 00:34:53 crc kubenswrapper[4911]: I1201 00:34:53.616487 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1a7e3ec4efab6fb9f9fae3952f6bf5f88cfdc858c5e28493807deefcc1ba058" Dec 01 00:34:54 crc kubenswrapper[4911]: I1201 00:34:54.625232 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hlp8d" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerName="registry-server" containerID="cri-o://361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3" gracePeriod=2 Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.050439 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.103253 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-utilities\") pod \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.103308 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-752lk\" (UniqueName: \"kubernetes.io/projected/05a322b2-5a1b-4cdd-879d-5f55a25baddf-kube-api-access-752lk\") pod \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.103378 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-catalog-content\") pod \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\" (UID: \"05a322b2-5a1b-4cdd-879d-5f55a25baddf\") " Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.104906 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-utilities" (OuterVolumeSpecName: "utilities") pod "05a322b2-5a1b-4cdd-879d-5f55a25baddf" (UID: "05a322b2-5a1b-4cdd-879d-5f55a25baddf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.108088 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a322b2-5a1b-4cdd-879d-5f55a25baddf-kube-api-access-752lk" (OuterVolumeSpecName: "kube-api-access-752lk") pod "05a322b2-5a1b-4cdd-879d-5f55a25baddf" (UID: "05a322b2-5a1b-4cdd-879d-5f55a25baddf"). InnerVolumeSpecName "kube-api-access-752lk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.153489 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05a322b2-5a1b-4cdd-879d-5f55a25baddf" (UID: "05a322b2-5a1b-4cdd-879d-5f55a25baddf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.205142 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.205172 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-752lk\" (UniqueName: \"kubernetes.io/projected/05a322b2-5a1b-4cdd-879d-5f55a25baddf-kube-api-access-752lk\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.205183 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05a322b2-5a1b-4cdd-879d-5f55a25baddf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.636229 4911 generic.go:334] "Generic (PLEG): container finished" podID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerID="361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3" exitCode=0 Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.636274 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hlp8d" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.636290 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlp8d" event={"ID":"05a322b2-5a1b-4cdd-879d-5f55a25baddf","Type":"ContainerDied","Data":"361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3"} Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.636327 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlp8d" event={"ID":"05a322b2-5a1b-4cdd-879d-5f55a25baddf","Type":"ContainerDied","Data":"3f8505870683c9c12f3e45eb4b0bd2e5b9a7c79e1df71f643abea8a7529019a1"} Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.636356 4911 scope.go:117] "RemoveContainer" containerID="361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.662153 4911 scope.go:117] "RemoveContainer" containerID="a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.664228 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlp8d"] Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.669268 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hlp8d"] Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.681315 4911 scope.go:117] "RemoveContainer" containerID="9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.705544 4911 scope.go:117] "RemoveContainer" containerID="361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3" Dec 01 00:34:55 crc kubenswrapper[4911]: E1201 00:34:55.706646 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3\": container with ID starting with 361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3 not found: ID does not exist" containerID="361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.706697 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3"} err="failed to get container status \"361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3\": rpc error: code = NotFound desc = could not find container \"361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3\": container with ID starting with 361cae02a1f6faf15e672faa5e318aa4a38535eceb2e1875569a0538d39d31c3 not found: ID does not exist" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.706745 4911 scope.go:117] "RemoveContainer" containerID="a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac" Dec 01 00:34:55 crc kubenswrapper[4911]: E1201 00:34:55.707771 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac\": container with ID starting with a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac not found: ID does not exist" containerID="a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.707806 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac"} err="failed to get container status \"a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac\": rpc error: code = NotFound desc = could not find container \"a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac\": container with ID 
starting with a8f98c463b0a8862b66fd35a4c2b2de530d6888b404778a4bacb4caa9fa106ac not found: ID does not exist" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.707831 4911 scope.go:117] "RemoveContainer" containerID="9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf" Dec 01 00:34:55 crc kubenswrapper[4911]: E1201 00:34:55.708514 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf\": container with ID starting with 9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf not found: ID does not exist" containerID="9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf" Dec 01 00:34:55 crc kubenswrapper[4911]: I1201 00:34:55.708535 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf"} err="failed to get container status \"9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf\": rpc error: code = NotFound desc = could not find container \"9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf\": container with ID starting with 9060c79e554f52048d6c1ffa0347741f85376a4b4071e338c23b0baa2f1867cf not found: ID does not exist" Dec 01 00:34:56 crc kubenswrapper[4911]: I1201 00:34:56.159492 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" path="/var/lib/kubelet/pods/05a322b2-5a1b-4cdd-879d-5f55a25baddf/volumes" Dec 01 00:35:05 crc kubenswrapper[4911]: I1201 00:35:05.151709 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:35:05 crc kubenswrapper[4911]: E1201 00:35:05.152515 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.545380 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Dec 01 00:35:09 crc kubenswrapper[4911]: E1201 00:35:09.546669 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerName="extract-utilities" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.546693 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerName="extract-utilities" Dec 01 00:35:09 crc kubenswrapper[4911]: E1201 00:35:09.546709 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerName="docker-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.546720 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerName="docker-build" Dec 01 00:35:09 crc kubenswrapper[4911]: E1201 00:35:09.546737 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerName="git-clone" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.546749 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerName="git-clone" Dec 01 00:35:09 crc kubenswrapper[4911]: E1201 00:35:09.546762 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerName="manage-dockerfile" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.546774 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5712053-331b-4321-a2ea-e71e1fafcd03" 
containerName="manage-dockerfile" Dec 01 00:35:09 crc kubenswrapper[4911]: E1201 00:35:09.546792 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerName="extract-content" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.546802 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerName="extract-content" Dec 01 00:35:09 crc kubenswrapper[4911]: E1201 00:35:09.546820 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerName="registry-server" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.546831 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerName="registry-server" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.546998 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a322b2-5a1b-4cdd-879d-5f55a25baddf" containerName="registry-server" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.547019 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5712053-331b-4321-a2ea-e71e1fafcd03" containerName="docker-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.549265 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613565 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613629 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613654 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613686 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt5q4\" (UniqueName: \"kubernetes.io/projected/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-kube-api-access-mt5q4\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613733 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613758 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613797 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613846 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.613960 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.614039 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.614075 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.614098 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.614144 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.615356 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.615757 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.615982 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.616014 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.616135 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-d6bvw" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.645076 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715401 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715484 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-blob-cache\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715504 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715527 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt5q4\" (UniqueName: \"kubernetes.io/projected/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-kube-api-access-mt5q4\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715571 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715592 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715623 4911 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715658 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715679 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715701 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715700 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.715721 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716116 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716146 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716152 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716189 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716188 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716287 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716391 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716732 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" 
Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.716768 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.717692 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.721209 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.721390 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.732920 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: 
\"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.733019 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt5q4\" (UniqueName: \"kubernetes.io/projected/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-kube-api-access-mt5q4\") pod \"service-telemetry-framework-index-1-build\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:09 crc kubenswrapper[4911]: I1201 00:35:09.940836 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:10 crc kubenswrapper[4911]: I1201 00:35:10.465213 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Dec 01 00:35:10 crc kubenswrapper[4911]: I1201 00:35:10.733688 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4","Type":"ContainerStarted","Data":"663ea7b69c0e10c84fc1dcba4a6f352a28954d90261abcb662a4720951c4d460"} Dec 01 00:35:11 crc kubenswrapper[4911]: I1201 00:35:11.743253 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4","Type":"ContainerStarted","Data":"4d4b369988f3ed33ea57ffc2cc7e7c1ce8ff25f467d06deab71b37d1a6771ead"} Dec 01 00:35:12 crc kubenswrapper[4911]: I1201 00:35:12.752592 4911 generic.go:334] "Generic (PLEG): container finished" podID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerID="4d4b369988f3ed33ea57ffc2cc7e7c1ce8ff25f467d06deab71b37d1a6771ead" 
exitCode=0 Dec 01 00:35:12 crc kubenswrapper[4911]: I1201 00:35:12.752663 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4","Type":"ContainerDied","Data":"4d4b369988f3ed33ea57ffc2cc7e7c1ce8ff25f467d06deab71b37d1a6771ead"} Dec 01 00:35:13 crc kubenswrapper[4911]: I1201 00:35:13.760194 4911 generic.go:334] "Generic (PLEG): container finished" podID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerID="40224eb1c07a34f06228a81135b5e5c9a28388cdb1d79c422b28be880839cadd" exitCode=0 Dec 01 00:35:13 crc kubenswrapper[4911]: I1201 00:35:13.760237 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4","Type":"ContainerDied","Data":"40224eb1c07a34f06228a81135b5e5c9a28388cdb1d79c422b28be880839cadd"} Dec 01 00:35:13 crc kubenswrapper[4911]: I1201 00:35:13.800526 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_abb86b90-97e0-4757-90b9-1fd5c9d4a3b4/manage-dockerfile/0.log" Dec 01 00:35:14 crc kubenswrapper[4911]: I1201 00:35:14.770178 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4","Type":"ContainerStarted","Data":"cf1c19d0806832cce3ce459f272b8228a3b1a340fa5338ea0a670acfe6e3b5d1"} Dec 01 00:35:19 crc kubenswrapper[4911]: I1201 00:35:19.152390 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:35:19 crc kubenswrapper[4911]: E1201 00:35:19.153611 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:35:34 crc kubenswrapper[4911]: I1201 00:35:34.152038 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:35:34 crc kubenswrapper[4911]: E1201 00:35:34.152811 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:35:45 crc kubenswrapper[4911]: I1201 00:35:45.990587 4911 generic.go:334] "Generic (PLEG): container finished" podID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerID="cf1c19d0806832cce3ce459f272b8228a3b1a340fa5338ea0a670acfe6e3b5d1" exitCode=0 Dec 01 00:35:45 crc kubenswrapper[4911]: I1201 00:35:45.990630 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4","Type":"ContainerDied","Data":"cf1c19d0806832cce3ce459f272b8228a3b1a340fa5338ea0a670acfe6e3b5d1"} Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.290849 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428054 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-ca-bundles\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428283 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildcachedir\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428374 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-proxy-ca-bundles\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428519 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-system-configs\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428640 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildworkdir\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428723 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-node-pullsecrets\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428834 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-pull\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428961 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.428994 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-run\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429071 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-push\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429112 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-root\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429168 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429228 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt5q4\" (UniqueName: \"kubernetes.io/projected/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-kube-api-access-mt5q4\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 
00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429262 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-blob-cache\") pod \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\" (UID: \"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4\") " Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429357 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429404 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429664 4911 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429687 4911 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildcachedir\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429707 4911 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-system-configs\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.429839 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.430179 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.430609 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.430927 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.435383 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-kube-api-access-mt5q4" (OuterVolumeSpecName: "kube-api-access-mt5q4") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "kube-api-access-mt5q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.435757 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.436325 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-push" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-push") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "builder-dockercfg-d6bvw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.436810 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-pull" (OuterVolumeSpecName: "builder-dockercfg-d6bvw-pull") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "builder-dockercfg-d6bvw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.531187 4911 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-buildworkdir\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.531222 4911 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.531232 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-pull\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-pull\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.531242 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-run\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.531251 4911 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-d6bvw-push\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-builder-dockercfg-d6bvw-push\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.531259 4911 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.531267 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt5q4\" (UniqueName: \"kubernetes.io/projected/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-kube-api-access-mt5q4\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.531276 4911 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.662280 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:35:47 crc kubenswrapper[4911]: I1201 00:35:47.733750 4911 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-build-blob-cache\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:48 crc kubenswrapper[4911]: I1201 00:35:48.010669 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"abb86b90-97e0-4757-90b9-1fd5c9d4a3b4","Type":"ContainerDied","Data":"663ea7b69c0e10c84fc1dcba4a6f352a28954d90261abcb662a4720951c4d460"} Dec 01 00:35:48 crc kubenswrapper[4911]: I1201 00:35:48.010709 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="663ea7b69c0e10c84fc1dcba4a6f352a28954d90261abcb662a4720951c4d460" Dec 01 00:35:48 crc kubenswrapper[4911]: I1201 00:35:48.010776 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Dec 01 00:35:48 crc kubenswrapper[4911]: I1201 00:35:48.533811 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" (UID: "abb86b90-97e0-4757-90b9-1fd5c9d4a3b4"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:35:48 crc kubenswrapper[4911]: I1201 00:35:48.543154 4911 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/abb86b90-97e0-4757-90b9-1fd5c9d4a3b4-container-storage-root\") on node \"crc\" DevicePath \"\"" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.151830 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:35:49 crc kubenswrapper[4911]: E1201 00:35:49.152221 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.364777 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-j9bfk"] Dec 01 00:35:49 crc kubenswrapper[4911]: E1201 00:35:49.365650 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerName="git-clone" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.365674 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerName="git-clone" Dec 01 00:35:49 crc kubenswrapper[4911]: E1201 00:35:49.365693 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerName="docker-build" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.365705 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerName="docker-build" Dec 01 00:35:49 crc 
kubenswrapper[4911]: E1201 00:35:49.365735 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerName="manage-dockerfile" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.365748 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerName="manage-dockerfile" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.365924 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="abb86b90-97e0-4757-90b9-1fd5c9d4a3b4" containerName="docker-build" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.368042 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.370655 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-operators-dockercfg-w9zm2" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.377378 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-j9bfk"] Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.557530 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtmdx\" (UniqueName: \"kubernetes.io/projected/54f859c2-c6fe-4905-8fef-4180dacdaf69-kube-api-access-gtmdx\") pod \"service-telemetry-framework-operators-j9bfk\" (UID: \"54f859c2-c6fe-4905-8fef-4180dacdaf69\") " pod="service-telemetry/service-telemetry-framework-operators-j9bfk" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.659155 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmdx\" (UniqueName: \"kubernetes.io/projected/54f859c2-c6fe-4905-8fef-4180dacdaf69-kube-api-access-gtmdx\") pod \"service-telemetry-framework-operators-j9bfk\" (UID: 
\"54f859c2-c6fe-4905-8fef-4180dacdaf69\") " pod="service-telemetry/service-telemetry-framework-operators-j9bfk" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.690360 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmdx\" (UniqueName: \"kubernetes.io/projected/54f859c2-c6fe-4905-8fef-4180dacdaf69-kube-api-access-gtmdx\") pod \"service-telemetry-framework-operators-j9bfk\" (UID: \"54f859c2-c6fe-4905-8fef-4180dacdaf69\") " pod="service-telemetry/service-telemetry-framework-operators-j9bfk" Dec 01 00:35:49 crc kubenswrapper[4911]: I1201 00:35:49.987407 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" Dec 01 00:35:50 crc kubenswrapper[4911]: I1201 00:35:50.222325 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-j9bfk"] Dec 01 00:35:51 crc kubenswrapper[4911]: I1201 00:35:51.033831 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" event={"ID":"54f859c2-c6fe-4905-8fef-4180dacdaf69","Type":"ContainerStarted","Data":"c2a24293d61c9826d41c008467a52d40583fe4ccd3a54bdde5ebe4183195b64d"} Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.159388 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-j9bfk"] Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.362617 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-vgxzh"] Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.366753 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.374083 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-vgxzh"] Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.395288 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktfw\" (UniqueName: \"kubernetes.io/projected/5fbd0947-911e-4bd8-88d7-00daeea1d0d1-kube-api-access-dktfw\") pod \"service-telemetry-framework-operators-vgxzh\" (UID: \"5fbd0947-911e-4bd8-88d7-00daeea1d0d1\") " pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.497453 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktfw\" (UniqueName: \"kubernetes.io/projected/5fbd0947-911e-4bd8-88d7-00daeea1d0d1-kube-api-access-dktfw\") pod \"service-telemetry-framework-operators-vgxzh\" (UID: \"5fbd0947-911e-4bd8-88d7-00daeea1d0d1\") " pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.517683 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktfw\" (UniqueName: \"kubernetes.io/projected/5fbd0947-911e-4bd8-88d7-00daeea1d0d1-kube-api-access-dktfw\") pod \"service-telemetry-framework-operators-vgxzh\" (UID: \"5fbd0947-911e-4bd8-88d7-00daeea1d0d1\") " pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.687671 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:35:52 crc kubenswrapper[4911]: I1201 00:35:52.923424 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-vgxzh"] Dec 01 00:35:53 crc kubenswrapper[4911]: I1201 00:35:53.056476 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" event={"ID":"5fbd0947-911e-4bd8-88d7-00daeea1d0d1","Type":"ContainerStarted","Data":"fab09d5312392cb6d95580163290d960fe873fa7b1b19c33d90971daa7d1322d"} Dec 01 00:36:00 crc kubenswrapper[4911]: I1201 00:36:00.168708 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:36:00 crc kubenswrapper[4911]: E1201 00:36:00.169737 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:36:05 crc kubenswrapper[4911]: I1201 00:36:05.149931 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" event={"ID":"5fbd0947-911e-4bd8-88d7-00daeea1d0d1","Type":"ContainerStarted","Data":"64d7eb1fe7d8bec2c4fb79c1b3a8b2193fb08b2fe9e0705c37f800de0ff0d58e"} Dec 01 00:36:05 crc kubenswrapper[4911]: I1201 00:36:05.153424 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" event={"ID":"54f859c2-c6fe-4905-8fef-4180dacdaf69","Type":"ContainerStarted","Data":"7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4"} Dec 01 00:36:05 crc 
kubenswrapper[4911]: I1201 00:36:05.153642 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" podUID="54f859c2-c6fe-4905-8fef-4180dacdaf69" containerName="registry-server" containerID="cri-o://7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4" gracePeriod=2 Dec 01 00:36:05 crc kubenswrapper[4911]: I1201 00:36:05.177884 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" podStartSLOduration=1.440147567 podStartE2EDuration="13.177858258s" podCreationTimestamp="2025-12-01 00:35:52 +0000 UTC" firstStartedPulling="2025-12-01 00:35:52.936738791 +0000 UTC m=+1713.075435562" lastFinishedPulling="2025-12-01 00:36:04.674449482 +0000 UTC m=+1724.813146253" observedRunningTime="2025-12-01 00:36:05.164027619 +0000 UTC m=+1725.302724400" watchObservedRunningTime="2025-12-01 00:36:05.177858258 +0000 UTC m=+1725.316555029" Dec 01 00:36:05 crc kubenswrapper[4911]: I1201 00:36:05.186112 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" podStartSLOduration=1.774612386 podStartE2EDuration="16.18609038s" podCreationTimestamp="2025-12-01 00:35:49 +0000 UTC" firstStartedPulling="2025-12-01 00:35:50.230264127 +0000 UTC m=+1710.368960899" lastFinishedPulling="2025-12-01 00:36:04.641742112 +0000 UTC m=+1724.780438893" observedRunningTime="2025-12-01 00:36:05.183420995 +0000 UTC m=+1725.322117786" watchObservedRunningTime="2025-12-01 00:36:05.18609038 +0000 UTC m=+1725.324787161" Dec 01 00:36:05 crc kubenswrapper[4911]: I1201 00:36:05.591473 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" Dec 01 00:36:05 crc kubenswrapper[4911]: I1201 00:36:05.691750 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtmdx\" (UniqueName: \"kubernetes.io/projected/54f859c2-c6fe-4905-8fef-4180dacdaf69-kube-api-access-gtmdx\") pod \"54f859c2-c6fe-4905-8fef-4180dacdaf69\" (UID: \"54f859c2-c6fe-4905-8fef-4180dacdaf69\") " Dec 01 00:36:05 crc kubenswrapper[4911]: I1201 00:36:05.696418 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f859c2-c6fe-4905-8fef-4180dacdaf69-kube-api-access-gtmdx" (OuterVolumeSpecName: "kube-api-access-gtmdx") pod "54f859c2-c6fe-4905-8fef-4180dacdaf69" (UID: "54f859c2-c6fe-4905-8fef-4180dacdaf69"). InnerVolumeSpecName "kube-api-access-gtmdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:36:05 crc kubenswrapper[4911]: I1201 00:36:05.793041 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtmdx\" (UniqueName: \"kubernetes.io/projected/54f859c2-c6fe-4905-8fef-4180dacdaf69-kube-api-access-gtmdx\") on node \"crc\" DevicePath \"\"" Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.165803 4911 generic.go:334] "Generic (PLEG): container finished" podID="54f859c2-c6fe-4905-8fef-4180dacdaf69" containerID="7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4" exitCode=0 Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.165843 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.165856 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" event={"ID":"54f859c2-c6fe-4905-8fef-4180dacdaf69","Type":"ContainerDied","Data":"7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4"} Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.166266 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-j9bfk" event={"ID":"54f859c2-c6fe-4905-8fef-4180dacdaf69","Type":"ContainerDied","Data":"c2a24293d61c9826d41c008467a52d40583fe4ccd3a54bdde5ebe4183195b64d"} Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.166308 4911 scope.go:117] "RemoveContainer" containerID="7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4" Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.189743 4911 scope.go:117] "RemoveContainer" containerID="7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4" Dec 01 00:36:06 crc kubenswrapper[4911]: E1201 00:36:06.190629 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4\": container with ID starting with 7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4 not found: ID does not exist" containerID="7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4" Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.190678 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4"} err="failed to get container status \"7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4\": rpc error: code = NotFound desc = could not find container 
\"7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4\": container with ID starting with 7d3d06034ed0e4408d94428f622ec07fc5550b2d8edca02a5f1927dde9adbda4 not found: ID does not exist" Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.206704 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-j9bfk"] Dec 01 00:36:06 crc kubenswrapper[4911]: I1201 00:36:06.211486 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-j9bfk"] Dec 01 00:36:08 crc kubenswrapper[4911]: I1201 00:36:08.158136 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f859c2-c6fe-4905-8fef-4180dacdaf69" path="/var/lib/kubelet/pods/54f859c2-c6fe-4905-8fef-4180dacdaf69/volumes" Dec 01 00:36:12 crc kubenswrapper[4911]: I1201 00:36:12.688275 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:36:12 crc kubenswrapper[4911]: I1201 00:36:12.688648 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:36:12 crc kubenswrapper[4911]: I1201 00:36:12.724688 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:36:13 crc kubenswrapper[4911]: I1201 00:36:13.255084 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-vgxzh" Dec 01 00:36:15 crc kubenswrapper[4911]: I1201 00:36:15.151759 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:36:15 crc kubenswrapper[4911]: E1201 00:36:15.152004 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.422489 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx"] Dec 01 00:36:24 crc kubenswrapper[4911]: E1201 00:36:24.423321 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f859c2-c6fe-4905-8fef-4180dacdaf69" containerName="registry-server" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.423336 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f859c2-c6fe-4905-8fef-4180dacdaf69" containerName="registry-server" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.423522 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="54f859c2-c6fe-4905-8fef-4180dacdaf69" containerName="registry-server" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.424793 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.437387 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx"] Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.563197 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.563270 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6v7z\" (UniqueName: \"kubernetes.io/projected/5125ad40-494f-4d32-b157-67836cd1d2c1-kube-api-access-d6v7z\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.563390 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.664856 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-util\") pod 
\"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.664918 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.664947 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6v7z\" (UniqueName: \"kubernetes.io/projected/5125ad40-494f-4d32-b157-67836cd1d2c1-kube-api-access-d6v7z\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.665504 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.665520 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " 
pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.684141 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6v7z\" (UniqueName: \"kubernetes.io/projected/5125ad40-494f-4d32-b157-67836cd1d2c1-kube-api-access-d6v7z\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.740873 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" Dec 01 00:36:24 crc kubenswrapper[4911]: I1201 00:36:24.963531 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx"] Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.205847 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb"] Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.207239 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.230743 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb"] Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.272721 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.272807 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.272867 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/c25f680d-0e88-4e21-8017-c69353eb4849-kube-api-access-ctwl5\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: E1201 00:36:25.301757 4911 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5125ad40_494f_4d32_b157_67836cd1d2c1.slice/crio-conmon-0f92abf861542c1452c27bf9dddd3a35199b703861ab5199ff31ce23aaddaf94.scope\": RecentStats: unable to find data in memory cache]" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.306718 4911 generic.go:334] "Generic (PLEG): container finished" podID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerID="0f92abf861542c1452c27bf9dddd3a35199b703861ab5199ff31ce23aaddaf94" exitCode=0 Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.306755 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" event={"ID":"5125ad40-494f-4d32-b157-67836cd1d2c1","Type":"ContainerDied","Data":"0f92abf861542c1452c27bf9dddd3a35199b703861ab5199ff31ce23aaddaf94"} Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.306781 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" event={"ID":"5125ad40-494f-4d32-b157-67836cd1d2c1","Type":"ContainerStarted","Data":"715cabdc6666c7feb695cd5d49f4fc7598983834cd10d422f4e561b1244fcddc"} Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.308642 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.373844 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.373917 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctwl5\" 
(UniqueName: \"kubernetes.io/projected/c25f680d-0e88-4e21-8017-c69353eb4849-kube-api-access-ctwl5\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.373963 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.374480 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.374560 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.393537 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/c25f680d-0e88-4e21-8017-c69353eb4849-kube-api-access-ctwl5\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb\" (UID: 
\"c25f680d-0e88-4e21-8017-c69353eb4849\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.615426 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" Dec 01 00:36:25 crc kubenswrapper[4911]: I1201 00:36:25.894198 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb"] Dec 01 00:36:25 crc kubenswrapper[4911]: W1201 00:36:25.928757 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc25f680d_0e88_4e21_8017_c69353eb4849.slice/crio-687cd6e1e9b79dbb67748017d4a1b8945d1bc897cc1c3f8595103ad4600cc25c WatchSource:0}: Error finding container 687cd6e1e9b79dbb67748017d4a1b8945d1bc897cc1c3f8595103ad4600cc25c: Status 404 returned error can't find the container with id 687cd6e1e9b79dbb67748017d4a1b8945d1bc897cc1c3f8595103ad4600cc25c Dec 01 00:36:26 crc kubenswrapper[4911]: I1201 00:36:26.315881 4911 generic.go:334] "Generic (PLEG): container finished" podID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerID="1c2cf87fb1ee8c82792bb6a523a043102276fed7e9312af7775dc3e6f3f678ef" exitCode=0 Dec 01 00:36:26 crc kubenswrapper[4911]: I1201 00:36:26.316001 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" event={"ID":"5125ad40-494f-4d32-b157-67836cd1d2c1","Type":"ContainerDied","Data":"1c2cf87fb1ee8c82792bb6a523a043102276fed7e9312af7775dc3e6f3f678ef"} Dec 01 00:36:26 crc kubenswrapper[4911]: I1201 00:36:26.320126 4911 generic.go:334] "Generic (PLEG): container finished" podID="c25f680d-0e88-4e21-8017-c69353eb4849" containerID="85530eaaf6a23165594c99710e1666e1d4ac2b02ee981abac0be260ab5fdffee" exitCode=0 Dec 01 00:36:26 crc 
kubenswrapper[4911]: I1201 00:36:26.320207 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" event={"ID":"c25f680d-0e88-4e21-8017-c69353eb4849","Type":"ContainerDied","Data":"85530eaaf6a23165594c99710e1666e1d4ac2b02ee981abac0be260ab5fdffee"}
Dec 01 00:36:26 crc kubenswrapper[4911]: I1201 00:36:26.320258 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" event={"ID":"c25f680d-0e88-4e21-8017-c69353eb4849","Type":"ContainerStarted","Data":"687cd6e1e9b79dbb67748017d4a1b8945d1bc897cc1c3f8595103ad4600cc25c"}
Dec 01 00:36:27 crc kubenswrapper[4911]: I1201 00:36:27.328531 4911 generic.go:334] "Generic (PLEG): container finished" podID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerID="bb9803cf52bb8ac6121264dfbdb40e76ebf912f1bf61ff24c108f54ee5318760" exitCode=0
Dec 01 00:36:27 crc kubenswrapper[4911]: I1201 00:36:27.328589 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" event={"ID":"5125ad40-494f-4d32-b157-67836cd1d2c1","Type":"ContainerDied","Data":"bb9803cf52bb8ac6121264dfbdb40e76ebf912f1bf61ff24c108f54ee5318760"}
Dec 01 00:36:27 crc kubenswrapper[4911]: I1201 00:36:27.331218 4911 generic.go:334] "Generic (PLEG): container finished" podID="c25f680d-0e88-4e21-8017-c69353eb4849" containerID="c015d9da813f36c067b81970be6cd51d8d5b8413e1b96005ca1b4156c560a945" exitCode=0
Dec 01 00:36:27 crc kubenswrapper[4911]: I1201 00:36:27.331267 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" event={"ID":"c25f680d-0e88-4e21-8017-c69353eb4849","Type":"ContainerDied","Data":"c015d9da813f36c067b81970be6cd51d8d5b8413e1b96005ca1b4156c560a945"}
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.343135 4911 generic.go:334] "Generic (PLEG): container finished" podID="c25f680d-0e88-4e21-8017-c69353eb4849" containerID="62e71d7e68665e8f9690ac99009073fdc408b911ec5f18e74651fe853b266820" exitCode=0
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.343211 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" event={"ID":"c25f680d-0e88-4e21-8017-c69353eb4849","Type":"ContainerDied","Data":"62e71d7e68665e8f9690ac99009073fdc408b911ec5f18e74651fe853b266820"}
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.633096 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx"
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.729370 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-bundle\") pod \"5125ad40-494f-4d32-b157-67836cd1d2c1\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") "
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.729434 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-util\") pod \"5125ad40-494f-4d32-b157-67836cd1d2c1\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") "
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.729498 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6v7z\" (UniqueName: \"kubernetes.io/projected/5125ad40-494f-4d32-b157-67836cd1d2c1-kube-api-access-d6v7z\") pod \"5125ad40-494f-4d32-b157-67836cd1d2c1\" (UID: \"5125ad40-494f-4d32-b157-67836cd1d2c1\") "
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.731537 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-bundle" (OuterVolumeSpecName: "bundle") pod "5125ad40-494f-4d32-b157-67836cd1d2c1" (UID: "5125ad40-494f-4d32-b157-67836cd1d2c1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.739796 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5125ad40-494f-4d32-b157-67836cd1d2c1-kube-api-access-d6v7z" (OuterVolumeSpecName: "kube-api-access-d6v7z") pod "5125ad40-494f-4d32-b157-67836cd1d2c1" (UID: "5125ad40-494f-4d32-b157-67836cd1d2c1"). InnerVolumeSpecName "kube-api-access-d6v7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.768560 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-util" (OuterVolumeSpecName: "util") pod "5125ad40-494f-4d32-b157-67836cd1d2c1" (UID: "5125ad40-494f-4d32-b157-67836cd1d2c1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.831371 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.831415 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5125ad40-494f-4d32-b157-67836cd1d2c1-util\") on node \"crc\" DevicePath \"\""
Dec 01 00:36:28 crc kubenswrapper[4911]: I1201 00:36:28.831429 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6v7z\" (UniqueName: \"kubernetes.io/projected/5125ad40-494f-4d32-b157-67836cd1d2c1-kube-api-access-d6v7z\") on node \"crc\" DevicePath \"\""
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.355578 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx" event={"ID":"5125ad40-494f-4d32-b157-67836cd1d2c1","Type":"ContainerDied","Data":"715cabdc6666c7feb695cd5d49f4fc7598983834cd10d422f4e561b1244fcddc"}
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.355986 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="715cabdc6666c7feb695cd5d49f4fc7598983834cd10d422f4e561b1244fcddc"
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.355622 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a8n6jx"
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.731523 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb"
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.846986 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/c25f680d-0e88-4e21-8017-c69353eb4849-kube-api-access-ctwl5\") pod \"c25f680d-0e88-4e21-8017-c69353eb4849\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") "
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.847066 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-util\") pod \"c25f680d-0e88-4e21-8017-c69353eb4849\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") "
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.847158 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-bundle\") pod \"c25f680d-0e88-4e21-8017-c69353eb4849\" (UID: \"c25f680d-0e88-4e21-8017-c69353eb4849\") "
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.847826 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-bundle" (OuterVolumeSpecName: "bundle") pod "c25f680d-0e88-4e21-8017-c69353eb4849" (UID: "c25f680d-0e88-4e21-8017-c69353eb4849"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.855451 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25f680d-0e88-4e21-8017-c69353eb4849-kube-api-access-ctwl5" (OuterVolumeSpecName: "kube-api-access-ctwl5") pod "c25f680d-0e88-4e21-8017-c69353eb4849" (UID: "c25f680d-0e88-4e21-8017-c69353eb4849"). InnerVolumeSpecName "kube-api-access-ctwl5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.861681 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-util" (OuterVolumeSpecName: "util") pod "c25f680d-0e88-4e21-8017-c69353eb4849" (UID: "c25f680d-0e88-4e21-8017-c69353eb4849"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.948440 4911 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.948485 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctwl5\" (UniqueName: \"kubernetes.io/projected/c25f680d-0e88-4e21-8017-c69353eb4849-kube-api-access-ctwl5\") on node \"crc\" DevicePath \"\""
Dec 01 00:36:29 crc kubenswrapper[4911]: I1201 00:36:29.948495 4911 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c25f680d-0e88-4e21-8017-c69353eb4849-util\") on node \"crc\" DevicePath \"\""
Dec 01 00:36:30 crc kubenswrapper[4911]: I1201 00:36:30.163422 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"
Dec 01 00:36:30 crc kubenswrapper[4911]: E1201 00:36:30.163995 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a"
Dec 01 00:36:30 crc kubenswrapper[4911]: I1201 00:36:30.366412 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb" event={"ID":"c25f680d-0e88-4e21-8017-c69353eb4849","Type":"ContainerDied","Data":"687cd6e1e9b79dbb67748017d4a1b8945d1bc897cc1c3f8595103ad4600cc25c"}
Dec 01 00:36:30 crc kubenswrapper[4911]: I1201 00:36:30.366453 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="687cd6e1e9b79dbb67748017d4a1b8945d1bc897cc1c3f8595103ad4600cc25c"
Dec 01 00:36:30 crc kubenswrapper[4911]: I1201 00:36:30.366570 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09s7mnb"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.707376 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"]
Dec 01 00:36:36 crc kubenswrapper[4911]: E1201 00:36:36.708547 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerName="util"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.708568 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerName="util"
Dec 01 00:36:36 crc kubenswrapper[4911]: E1201 00:36:36.708580 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerName="extract"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.708588 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerName="extract"
Dec 01 00:36:36 crc kubenswrapper[4911]: E1201 00:36:36.708600 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25f680d-0e88-4e21-8017-c69353eb4849" containerName="extract"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.708607 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25f680d-0e88-4e21-8017-c69353eb4849" containerName="extract"
Dec 01 00:36:36 crc kubenswrapper[4911]: E1201 00:36:36.708628 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerName="pull"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.708635 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerName="pull"
Dec 01 00:36:36 crc kubenswrapper[4911]: E1201 00:36:36.708651 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25f680d-0e88-4e21-8017-c69353eb4849" containerName="pull"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.708658 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25f680d-0e88-4e21-8017-c69353eb4849" containerName="pull"
Dec 01 00:36:36 crc kubenswrapper[4911]: E1201 00:36:36.708671 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25f680d-0e88-4e21-8017-c69353eb4849" containerName="util"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.708677 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25f680d-0e88-4e21-8017-c69353eb4849" containerName="util"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.708802 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25f680d-0e88-4e21-8017-c69353eb4849" containerName="extract"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.708824 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="5125ad40-494f-4d32-b157-67836cd1d2c1" containerName="extract"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.709350 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.711946 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-b8qpt"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.735885 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"]
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.747263 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7514cd3a-f88e-48ee-9c94-7e6cd903201e-runner\") pod \"service-telemetry-operator-5d44b4d989-2ktqj\" (UID: \"7514cd3a-f88e-48ee-9c94-7e6cd903201e\") " pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.747337 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cznl\" (UniqueName: \"kubernetes.io/projected/7514cd3a-f88e-48ee-9c94-7e6cd903201e-kube-api-access-9cznl\") pod \"service-telemetry-operator-5d44b4d989-2ktqj\" (UID: \"7514cd3a-f88e-48ee-9c94-7e6cd903201e\") " pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.848952 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7514cd3a-f88e-48ee-9c94-7e6cd903201e-runner\") pod \"service-telemetry-operator-5d44b4d989-2ktqj\" (UID: \"7514cd3a-f88e-48ee-9c94-7e6cd903201e\") " pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.849203 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cznl\" (UniqueName: \"kubernetes.io/projected/7514cd3a-f88e-48ee-9c94-7e6cd903201e-kube-api-access-9cznl\") pod \"service-telemetry-operator-5d44b4d989-2ktqj\" (UID: \"7514cd3a-f88e-48ee-9c94-7e6cd903201e\") " pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.849725 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/7514cd3a-f88e-48ee-9c94-7e6cd903201e-runner\") pod \"service-telemetry-operator-5d44b4d989-2ktqj\" (UID: \"7514cd3a-f88e-48ee-9c94-7e6cd903201e\") " pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"
Dec 01 00:36:36 crc kubenswrapper[4911]: I1201 00:36:36.870673 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cznl\" (UniqueName: \"kubernetes.io/projected/7514cd3a-f88e-48ee-9c94-7e6cd903201e-kube-api-access-9cznl\") pod \"service-telemetry-operator-5d44b4d989-2ktqj\" (UID: \"7514cd3a-f88e-48ee-9c94-7e6cd903201e\") " pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"
Dec 01 00:36:37 crc kubenswrapper[4911]: I1201 00:36:37.032282 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"
Dec 01 00:36:37 crc kubenswrapper[4911]: I1201 00:36:37.288850 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj"]
Dec 01 00:36:37 crc kubenswrapper[4911]: I1201 00:36:37.413656 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj" event={"ID":"7514cd3a-f88e-48ee-9c94-7e6cd903201e","Type":"ContainerStarted","Data":"6cdae6f88ccd26759c01272bf7094c37960064ed46f1be0621a3c409c0ba8592"}
Dec 01 00:36:39 crc kubenswrapper[4911]: I1201 00:36:39.920234 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"]
Dec 01 00:36:39 crc kubenswrapper[4911]: I1201 00:36:39.921286 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"
Dec 01 00:36:39 crc kubenswrapper[4911]: I1201 00:36:39.923405 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-x4bjr"
Dec 01 00:36:39 crc kubenswrapper[4911]: I1201 00:36:39.993550 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"]
Dec 01 00:36:40 crc kubenswrapper[4911]: I1201 00:36:40.106190 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/f172be71-8395-4d1e-a2ce-8692cde84dd1-runner\") pod \"smart-gateway-operator-6c444f9775-5nwgx\" (UID: \"f172be71-8395-4d1e-a2ce-8692cde84dd1\") " pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"
Dec 01 00:36:40 crc kubenswrapper[4911]: I1201 00:36:40.106417 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fks6q\" (UniqueName: \"kubernetes.io/projected/f172be71-8395-4d1e-a2ce-8692cde84dd1-kube-api-access-fks6q\") pod \"smart-gateway-operator-6c444f9775-5nwgx\" (UID: \"f172be71-8395-4d1e-a2ce-8692cde84dd1\") " pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"
Dec 01 00:36:40 crc kubenswrapper[4911]: I1201 00:36:40.207566 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fks6q\" (UniqueName: \"kubernetes.io/projected/f172be71-8395-4d1e-a2ce-8692cde84dd1-kube-api-access-fks6q\") pod \"smart-gateway-operator-6c444f9775-5nwgx\" (UID: \"f172be71-8395-4d1e-a2ce-8692cde84dd1\") " pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"
Dec 01 00:36:40 crc kubenswrapper[4911]: I1201 00:36:40.207683 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/f172be71-8395-4d1e-a2ce-8692cde84dd1-runner\") pod \"smart-gateway-operator-6c444f9775-5nwgx\" (UID: \"f172be71-8395-4d1e-a2ce-8692cde84dd1\") " pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"
Dec 01 00:36:40 crc kubenswrapper[4911]: I1201 00:36:40.208226 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/f172be71-8395-4d1e-a2ce-8692cde84dd1-runner\") pod \"smart-gateway-operator-6c444f9775-5nwgx\" (UID: \"f172be71-8395-4d1e-a2ce-8692cde84dd1\") " pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"
Dec 01 00:36:40 crc kubenswrapper[4911]: I1201 00:36:40.226822 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fks6q\" (UniqueName: \"kubernetes.io/projected/f172be71-8395-4d1e-a2ce-8692cde84dd1-kube-api-access-fks6q\") pod \"smart-gateway-operator-6c444f9775-5nwgx\" (UID: \"f172be71-8395-4d1e-a2ce-8692cde84dd1\") " pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"
Dec 01 00:36:40 crc kubenswrapper[4911]: I1201 00:36:40.242248 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"
Dec 01 00:36:40 crc kubenswrapper[4911]: I1201 00:36:40.545013 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-6c444f9775-5nwgx"]
Dec 01 00:36:41 crc kubenswrapper[4911]: I1201 00:36:41.445281 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx" event={"ID":"f172be71-8395-4d1e-a2ce-8692cde84dd1","Type":"ContainerStarted","Data":"3be4f4d5fe28318a8a5f17752f5f9785e98bcff97ad80fb8543d1c8628434ed5"}
Dec 01 00:36:44 crc kubenswrapper[4911]: I1201 00:36:44.151731 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"
Dec 01 00:36:44 crc kubenswrapper[4911]: E1201 00:36:44.152230 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a"
Dec 01 00:36:57 crc kubenswrapper[4911]: I1201 00:36:57.154536 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"
Dec 01 00:36:57 crc kubenswrapper[4911]: E1201 00:36:57.155389 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a"
Dec 01 00:37:01 crc kubenswrapper[4911]: E1201 00:37:01.033569 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5"
Dec 01 00:37:01 crc kubenswrapper[4911]: E1201 00:37:01.034371 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1764549235,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fks6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-6c444f9775-5nwgx_service-telemetry(f172be71-8395-4d1e-a2ce-8692cde84dd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 01 00:37:01 crc kubenswrapper[4911]: E1201 00:37:01.035507 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx" podUID="f172be71-8395-4d1e-a2ce-8692cde84dd1"
Dec 01 00:37:01 crc kubenswrapper[4911]: E1201 00:37:01.206734 4911 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/service-telemetry-operator:stable-1.5"
Dec 01 00:37:01 crc kubenswrapper[4911]: E1201 00:37:01.206928 4911 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/infrawatch/service-telemetry-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:service-telemetry-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_WEBHOOK_SNMP_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/prometheus-webhook-snmp:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_IMAGE,Value:quay.io/prometheus/prometheus:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER_IMAGE,Value:quay.io/prometheus/alertmanager:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:service-telemetry-operator.v1.5.1764549237,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9cznl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-telemetry-operator-5d44b4d989-2ktqj_service-telemetry(7514cd3a-f88e-48ee-9c94-7e6cd903201e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 01 00:37:01 crc kubenswrapper[4911]: E1201 00:37:01.208148 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj" podUID="7514cd3a-f88e-48ee-9c94-7e6cd903201e"
Dec 01 00:37:01 crc kubenswrapper[4911]: E1201 00:37:01.650871 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/service-telemetry-operator:stable-1.5\\\"\"" pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj" podUID="7514cd3a-f88e-48ee-9c94-7e6cd903201e"
Dec 01 00:37:01 crc kubenswrapper[4911]: E1201 00:37:01.651130 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx" podUID="f172be71-8395-4d1e-a2ce-8692cde84dd1"
Dec 01 00:37:10 crc kubenswrapper[4911]: I1201 00:37:10.155279 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"
Dec 01 00:37:10 crc kubenswrapper[4911]: E1201 00:37:10.155983 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a"
Dec 01 00:37:17 crc kubenswrapper[4911]: I1201 00:37:17.768883 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx" event={"ID":"f172be71-8395-4d1e-a2ce-8692cde84dd1","Type":"ContainerStarted","Data":"806dced01b1537e67991264655ce2c7d02490b95eba935dadd90c8eebbd3f997"}
Dec 01 00:37:17 crc kubenswrapper[4911]: I1201 00:37:17.821612 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-6c444f9775-5nwgx" podStartSLOduration=2.620263418 podStartE2EDuration="38.821595531s" podCreationTimestamp="2025-12-01 00:36:39 +0000 UTC" firstStartedPulling="2025-12-01 00:36:40.558262182 +0000 UTC m=+1760.696958953" lastFinishedPulling="2025-12-01 00:37:16.759594295 +0000 UTC m=+1796.898291066" observedRunningTime="2025-12-01 00:37:17.818096702 +0000 UTC m=+1797.956793493" watchObservedRunningTime="2025-12-01 00:37:17.821595531 +0000 UTC m=+1797.960292302"
Dec 01 00:37:18 crc kubenswrapper[4911]: I1201 00:37:18.778043 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj" event={"ID":"7514cd3a-f88e-48ee-9c94-7e6cd903201e","Type":"ContainerStarted","Data":"40408ccb5cc28d3ed509dd0eb83989d1b8a3c1bd447cfaedd028c339e6633a60"}
Dec 01 00:37:25 crc kubenswrapper[4911]: I1201 00:37:25.151979 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"
Dec 01 00:37:25 crc kubenswrapper[4911]: E1201 00:37:25.152854 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a"
Dec 01 00:37:36 crc kubenswrapper[4911]: I1201 00:37:36.152503 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"
Dec 01 00:37:36 crc kubenswrapper[4911]: E1201 00:37:36.153269 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.685184 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-5d44b4d989-2ktqj" podStartSLOduration=26.180164174 podStartE2EDuration="1m6.68516365s" podCreationTimestamp="2025-12-01 00:36:36 +0000 UTC" firstStartedPulling="2025-12-01 00:36:37.300078059 +0000 UTC m=+1757.438774830" lastFinishedPulling="2025-12-01 00:37:17.805077535 +0000 UTC m=+1797.943774306" observedRunningTime="2025-12-01 00:37:18.798940861 +0000 UTC m=+1798.937637642" watchObservedRunningTime="2025-12-01 00:37:42.68516365 +0000 UTC m=+1822.823860431"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.688848 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4tvsb"]
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.689800 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.696015 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.697790 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.697930 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.697982 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.697946 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.698247 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.700073 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-z7k2d"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.700198 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4tvsb"]
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.758387 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.758431 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.758474 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4pxl\" (UniqueName: \"kubernetes.io/projected/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-kube-api-access-h4pxl\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.758498 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb"
Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.758518 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " 
pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.758541 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-users\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.758586 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-config\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.860163 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-config\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.860269 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.860302 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.860328 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4pxl\" (UniqueName: \"kubernetes.io/projected/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-kube-api-access-h4pxl\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.860352 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.860374 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.860403 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-users\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " 
pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.861655 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-config\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.873414 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.874970 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.875365 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-users\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.875954 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.876635 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:42 crc kubenswrapper[4911]: I1201 00:37:42.877765 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4pxl\" (UniqueName: \"kubernetes.io/projected/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-kube-api-access-h4pxl\") pod \"default-interconnect-68864d46cb-4tvsb\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") " pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:43 crc kubenswrapper[4911]: I1201 00:37:43.017295 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" Dec 01 00:37:43 crc kubenswrapper[4911]: I1201 00:37:43.456719 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4tvsb"] Dec 01 00:37:43 crc kubenswrapper[4911]: W1201 00:37:43.457801 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b3f0422_775e_4d08_a19c_9c9d0a5cfae8.slice/crio-1188909e38388071bb2da1c05ea0816cad5949d7bfff08a54ddf759bf6186682 WatchSource:0}: Error finding container 1188909e38388071bb2da1c05ea0816cad5949d7bfff08a54ddf759bf6186682: Status 404 returned error can't find the container with id 1188909e38388071bb2da1c05ea0816cad5949d7bfff08a54ddf759bf6186682 Dec 01 00:37:43 crc kubenswrapper[4911]: I1201 00:37:43.939090 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" event={"ID":"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8","Type":"ContainerStarted","Data":"1188909e38388071bb2da1c05ea0816cad5949d7bfff08a54ddf759bf6186682"} Dec 01 00:37:48 crc kubenswrapper[4911]: I1201 00:37:48.977996 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" event={"ID":"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8","Type":"ContainerStarted","Data":"a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7"} Dec 01 00:37:48 crc kubenswrapper[4911]: I1201 00:37:48.996692 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" podStartSLOduration=2.3560731710000002 podStartE2EDuration="6.996672633s" podCreationTimestamp="2025-12-01 00:37:42 +0000 UTC" firstStartedPulling="2025-12-01 00:37:43.45955775 +0000 UTC m=+1823.598254531" lastFinishedPulling="2025-12-01 00:37:48.100157222 +0000 UTC m=+1828.238853993" observedRunningTime="2025-12-01 
00:37:48.99265661 +0000 UTC m=+1829.131353391" watchObservedRunningTime="2025-12-01 00:37:48.996672633 +0000 UTC m=+1829.135369404" Dec 01 00:37:51 crc kubenswrapper[4911]: I1201 00:37:51.151867 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:37:51 crc kubenswrapper[4911]: E1201 00:37:51.152222 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.749400 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.750979 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.752782 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.754107 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.754219 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.754442 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.754531 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.756940 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.757101 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.762509 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-ls9fz" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.773987 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948088 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-config\") pod \"prometheus-default-0\" (UID: 
\"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948166 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948285 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3a840446-75d9-4c19-816b-1e0424759234-config-out\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948418 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-web-config\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948512 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3a840446-75d9-4c19-816b-1e0424759234-tls-assets\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948546 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948592 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ce04dd95-a2a8-447d-84a0-a7e9095acd43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce04dd95-a2a8-447d-84a0-a7e9095acd43\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948638 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3a840446-75d9-4c19-816b-1e0424759234-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948695 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a840446-75d9-4c19-816b-1e0424759234-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:53 crc kubenswrapper[4911]: I1201 00:37:53.948781 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbdff\" (UniqueName: \"kubernetes.io/projected/3a840446-75d9-4c19-816b-1e0424759234-kube-api-access-nbdff\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 
00:37:54.050255 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-config\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050329 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050369 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3a840446-75d9-4c19-816b-1e0424759234-config-out\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050396 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-web-config\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050415 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3a840446-75d9-4c19-816b-1e0424759234-tls-assets\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050434 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050474 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ce04dd95-a2a8-447d-84a0-a7e9095acd43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce04dd95-a2a8-447d-84a0-a7e9095acd43\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050506 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3a840446-75d9-4c19-816b-1e0424759234-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050526 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a840446-75d9-4c19-816b-1e0424759234-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.050558 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbdff\" (UniqueName: \"kubernetes.io/projected/3a840446-75d9-4c19-816b-1e0424759234-kube-api-access-nbdff\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: E1201 00:37:54.051919 4911 
secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Dec 01 00:37:54 crc kubenswrapper[4911]: E1201 00:37:54.052008 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls podName:3a840446-75d9-4c19-816b-1e0424759234 nodeName:}" failed. No retries permitted until 2025-12-01 00:37:54.551983486 +0000 UTC m=+1834.690680337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "3a840446-75d9-4c19-816b-1e0424759234") : secret "default-prometheus-proxy-tls" not found Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.053257 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3a840446-75d9-4c19-816b-1e0424759234-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.053346 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a840446-75d9-4c19-816b-1e0424759234-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.056639 4911 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.056837 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ce04dd95-a2a8-447d-84a0-a7e9095acd43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce04dd95-a2a8-447d-84a0-a7e9095acd43\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/69c42c6da6e8fde80660f2053e386b9abd423e7f40639837c87004fef6b14c61/globalmount\"" pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.056770 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3a840446-75d9-4c19-816b-1e0424759234-tls-assets\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.056917 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-web-config\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.056758 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3a840446-75d9-4c19-816b-1e0424759234-config-out\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.058226 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-config\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " 
pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.060251 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.074244 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbdff\" (UniqueName: \"kubernetes.io/projected/3a840446-75d9-4c19-816b-1e0424759234-kube-api-access-nbdff\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.080711 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ce04dd95-a2a8-447d-84a0-a7e9095acd43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce04dd95-a2a8-447d-84a0-a7e9095acd43\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: I1201 00:37:54.560518 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:54 crc kubenswrapper[4911]: E1201 00:37:54.560943 4911 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Dec 01 00:37:54 crc kubenswrapper[4911]: E1201 00:37:54.561057 4911 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls podName:3a840446-75d9-4c19-816b-1e0424759234 nodeName:}" failed. No retries permitted until 2025-12-01 00:37:55.561034145 +0000 UTC m=+1835.699730916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "3a840446-75d9-4c19-816b-1e0424759234") : secret "default-prometheus-proxy-tls" not found Dec 01 00:37:55 crc kubenswrapper[4911]: I1201 00:37:55.571915 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:55 crc kubenswrapper[4911]: I1201 00:37:55.576842 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a840446-75d9-4c19-816b-1e0424759234-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"3a840446-75d9-4c19-816b-1e0424759234\") " pod="service-telemetry/prometheus-default-0" Dec 01 00:37:55 crc kubenswrapper[4911]: I1201 00:37:55.593375 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Dec 01 00:37:56 crc kubenswrapper[4911]: I1201 00:37:56.119390 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Dec 01 00:37:56 crc kubenswrapper[4911]: W1201 00:37:56.122932 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a840446_75d9_4c19_816b_1e0424759234.slice/crio-bf19a20a1fdc6127d33db0c43a2c1d70ff238a85804a89721faf9d0a3f6d6d2d WatchSource:0}: Error finding container bf19a20a1fdc6127d33db0c43a2c1d70ff238a85804a89721faf9d0a3f6d6d2d: Status 404 returned error can't find the container with id bf19a20a1fdc6127d33db0c43a2c1d70ff238a85804a89721faf9d0a3f6d6d2d Dec 01 00:37:56 crc kubenswrapper[4911]: I1201 00:37:56.147233 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"3a840446-75d9-4c19-816b-1e0424759234","Type":"ContainerStarted","Data":"bf19a20a1fdc6127d33db0c43a2c1d70ff238a85804a89721faf9d0a3f6d6d2d"} Dec 01 00:38:01 crc kubenswrapper[4911]: I1201 00:38:01.193396 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"3a840446-75d9-4c19-816b-1e0424759234","Type":"ContainerStarted","Data":"cb546b6aad7b2c47bd35f48efd111d4f13dbced272133d8b8bbe19dc9e4c2956"} Dec 01 00:38:03 crc kubenswrapper[4911]: I1201 00:38:03.550273 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lmsf6"] Dec 01 00:38:03 crc kubenswrapper[4911]: I1201 00:38:03.551073 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-lmsf6" Dec 01 00:38:03 crc kubenswrapper[4911]: I1201 00:38:03.561285 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lmsf6"] Dec 01 00:38:03 crc kubenswrapper[4911]: I1201 00:38:03.723212 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qfw\" (UniqueName: \"kubernetes.io/projected/7c4346a2-51ae-4df1-b881-d6eac247e811-kube-api-access-75qfw\") pod \"default-snmp-webhook-6856cfb745-lmsf6\" (UID: \"7c4346a2-51ae-4df1-b881-d6eac247e811\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lmsf6" Dec 01 00:38:03 crc kubenswrapper[4911]: I1201 00:38:03.825516 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qfw\" (UniqueName: \"kubernetes.io/projected/7c4346a2-51ae-4df1-b881-d6eac247e811-kube-api-access-75qfw\") pod \"default-snmp-webhook-6856cfb745-lmsf6\" (UID: \"7c4346a2-51ae-4df1-b881-d6eac247e811\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lmsf6" Dec 01 00:38:03 crc kubenswrapper[4911]: I1201 00:38:03.848248 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qfw\" (UniqueName: \"kubernetes.io/projected/7c4346a2-51ae-4df1-b881-d6eac247e811-kube-api-access-75qfw\") pod \"default-snmp-webhook-6856cfb745-lmsf6\" (UID: \"7c4346a2-51ae-4df1-b881-d6eac247e811\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-lmsf6" Dec 01 00:38:03 crc kubenswrapper[4911]: I1201 00:38:03.868499 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-lmsf6" Dec 01 00:38:04 crc kubenswrapper[4911]: I1201 00:38:04.139765 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-lmsf6"] Dec 01 00:38:04 crc kubenswrapper[4911]: I1201 00:38:04.221133 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-lmsf6" event={"ID":"7c4346a2-51ae-4df1-b881-d6eac247e811","Type":"ContainerStarted","Data":"fa774ab3e77097021f207a3c461f19c7d8a094d6bb3c59b999acbf60b0ffd729"} Dec 01 00:38:05 crc kubenswrapper[4911]: I1201 00:38:05.151821 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:38:05 crc kubenswrapper[4911]: E1201 00:38:05.152091 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.540967 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.542888 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.545018 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.545038 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.545023 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-95cvd" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.545799 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.545830 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.546332 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.561327 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715075 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715155 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/9a5d7340-9706-4c29-95e4-eb116c175acc-config-out\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715180 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-config-volume\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715206 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715421 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9a5d7340-9706-4c29-95e4-eb116c175acc-tls-assets\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715610 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-web-config\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715676 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715708 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzx7x\" (UniqueName: \"kubernetes.io/projected/9a5d7340-9706-4c29-95e4-eb116c175acc-kube-api-access-kzx7x\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.715748 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7e87734-fb4c-4d28-b9dc-bdf0310e4c43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7e87734-fb4c-4d28-b9dc-bdf0310e4c43\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817007 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9a5d7340-9706-4c29-95e4-eb116c175acc-config-out\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817042 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-config-volume\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817064 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817102 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9a5d7340-9706-4c29-95e4-eb116c175acc-tls-assets\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817138 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-web-config\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817166 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817182 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzx7x\" (UniqueName: \"kubernetes.io/projected/9a5d7340-9706-4c29-95e4-eb116c175acc-kube-api-access-kzx7x\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817206 4911 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-b7e87734-fb4c-4d28-b9dc-bdf0310e4c43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7e87734-fb4c-4d28-b9dc-bdf0310e4c43\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.817252 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: E1201 00:38:07.817308 4911 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Dec 01 00:38:07 crc kubenswrapper[4911]: E1201 00:38:07.817392 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls podName:9a5d7340-9706-4c29-95e4-eb116c175acc nodeName:}" failed. No retries permitted until 2025-12-01 00:38:08.317368144 +0000 UTC m=+1848.456064915 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "9a5d7340-9706-4c29-95e4-eb116c175acc") : secret "default-alertmanager-proxy-tls" not found Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.823630 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.824023 4911 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.824067 4911 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7e87734-fb4c-4d28-b9dc-bdf0310e4c43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7e87734-fb4c-4d28-b9dc-bdf0310e4c43\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/637b70d6fb0e6f44b033d326880d10db3d706b7c2768839a7abb71b327498981/globalmount\"" pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.824217 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9a5d7340-9706-4c29-95e4-eb116c175acc-tls-assets\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.824357 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.824475 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9a5d7340-9706-4c29-95e4-eb116c175acc-config-out\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.825554 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-web-config\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.839866 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzx7x\" (UniqueName: \"kubernetes.io/projected/9a5d7340-9706-4c29-95e4-eb116c175acc-kube-api-access-kzx7x\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.848012 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-config-volume\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:07 crc kubenswrapper[4911]: I1201 00:38:07.856578 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7e87734-fb4c-4d28-b9dc-bdf0310e4c43\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7e87734-fb4c-4d28-b9dc-bdf0310e4c43\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:08 crc kubenswrapper[4911]: I1201 00:38:08.323493 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:08 crc kubenswrapper[4911]: E1201 00:38:08.323695 4911 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Dec 01 00:38:08 crc kubenswrapper[4911]: E1201 00:38:08.323775 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls podName:9a5d7340-9706-4c29-95e4-eb116c175acc nodeName:}" failed. No retries permitted until 2025-12-01 00:38:09.323757808 +0000 UTC m=+1849.462454579 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "9a5d7340-9706-4c29-95e4-eb116c175acc") : secret "default-alertmanager-proxy-tls" not found Dec 01 00:38:09 crc kubenswrapper[4911]: I1201 00:38:09.258299 4911 generic.go:334] "Generic (PLEG): container finished" podID="3a840446-75d9-4c19-816b-1e0424759234" containerID="cb546b6aad7b2c47bd35f48efd111d4f13dbced272133d8b8bbe19dc9e4c2956" exitCode=0 Dec 01 00:38:09 crc kubenswrapper[4911]: I1201 00:38:09.258402 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"3a840446-75d9-4c19-816b-1e0424759234","Type":"ContainerDied","Data":"cb546b6aad7b2c47bd35f48efd111d4f13dbced272133d8b8bbe19dc9e4c2956"} Dec 01 00:38:09 crc kubenswrapper[4911]: I1201 00:38:09.339021 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:09 crc kubenswrapper[4911]: E1201 00:38:09.339201 4911 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Dec 01 00:38:09 crc kubenswrapper[4911]: E1201 00:38:09.339269 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls podName:9a5d7340-9706-4c29-95e4-eb116c175acc nodeName:}" failed. No retries permitted until 2025-12-01 00:38:11.339253384 +0000 UTC m=+1851.477950155 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "9a5d7340-9706-4c29-95e4-eb116c175acc") : secret "default-alertmanager-proxy-tls" not found Dec 01 00:38:11 crc kubenswrapper[4911]: I1201 00:38:11.366329 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:11 crc kubenswrapper[4911]: I1201 00:38:11.371850 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a5d7340-9706-4c29-95e4-eb116c175acc-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"9a5d7340-9706-4c29-95e4-eb116c175acc\") " pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:11 crc kubenswrapper[4911]: I1201 00:38:11.507979 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Dec 01 00:38:12 crc kubenswrapper[4911]: I1201 00:38:12.473515 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Dec 01 00:38:13 crc kubenswrapper[4911]: I1201 00:38:13.291753 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"9a5d7340-9706-4c29-95e4-eb116c175acc","Type":"ContainerStarted","Data":"02c89a9ff84e41fed9ed52d09acda970bf99c884a40e5c2a45740d76eb786128"} Dec 01 00:38:13 crc kubenswrapper[4911]: I1201 00:38:13.296627 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-lmsf6" event={"ID":"7c4346a2-51ae-4df1-b881-d6eac247e811","Type":"ContainerStarted","Data":"f594fc9b1095fa2d533dcf639ba44552d34b95ad1358ac79c6edb9452cca4c35"} Dec 01 00:38:13 crc kubenswrapper[4911]: I1201 00:38:13.318226 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-lmsf6" podStartSLOduration=2.350603538 podStartE2EDuration="10.318207486s" podCreationTimestamp="2025-12-01 00:38:03 +0000 UTC" firstStartedPulling="2025-12-01 00:38:04.160051328 +0000 UTC m=+1844.298748099" lastFinishedPulling="2025-12-01 00:38:12.127655266 +0000 UTC m=+1852.266352047" observedRunningTime="2025-12-01 00:38:13.315073178 +0000 UTC m=+1853.453769949" watchObservedRunningTime="2025-12-01 00:38:13.318207486 +0000 UTC m=+1853.456904257" Dec 01 00:38:15 crc kubenswrapper[4911]: I1201 00:38:15.312318 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"9a5d7340-9706-4c29-95e4-eb116c175acc","Type":"ContainerStarted","Data":"94bfecc6b61f73b77b4f69da0aa8e8ae2012c663ef1bee4623874b1b5b5e0307"} Dec 01 00:38:16 crc kubenswrapper[4911]: I1201 00:38:16.323046 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-default-0" event={"ID":"3a840446-75d9-4c19-816b-1e0424759234","Type":"ContainerStarted","Data":"3210ca508d00ec551d53d03c02eff6a59853cd924928d8ec8272468c8299777f"} Dec 01 00:38:17 crc kubenswrapper[4911]: I1201 00:38:17.152154 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:38:17 crc kubenswrapper[4911]: E1201 00:38:17.152630 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:38:18 crc kubenswrapper[4911]: I1201 00:38:18.337297 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"3a840446-75d9-4c19-816b-1e0424759234","Type":"ContainerStarted","Data":"3088db9d96692c513e3d75579421e67dba91e2197d332e718e116d36adef909e"} Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.463350 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p"] Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.465329 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.468220 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.470080 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.472584 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.472753 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-tmxmb" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.485835 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p"] Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.520401 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/91ab9c57-2dae-40c0-84b1-aca2e491c08f-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.520444 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzpls\" (UniqueName: \"kubernetes.io/projected/91ab9c57-2dae-40c0-84b1-aca2e491c08f-kube-api-access-bzpls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.520494 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.520515 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/91ab9c57-2dae-40c0-84b1-aca2e491c08f-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.520559 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.625181 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc 
kubenswrapper[4911]: I1201 00:38:21.625261 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/91ab9c57-2dae-40c0-84b1-aca2e491c08f-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.625287 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzpls\" (UniqueName: \"kubernetes.io/projected/91ab9c57-2dae-40c0-84b1-aca2e491c08f-kube-api-access-bzpls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.625328 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.625352 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/91ab9c57-2dae-40c0-84b1-aca2e491c08f-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: E1201 00:38:21.625794 4911 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret 
"default-cloud1-coll-meter-proxy-tls" not found Dec 01 00:38:21 crc kubenswrapper[4911]: E1201 00:38:21.625861 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls podName:91ab9c57-2dae-40c0-84b1-aca2e491c08f nodeName:}" failed. No retries permitted until 2025-12-01 00:38:22.125842876 +0000 UTC m=+1862.264539647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" (UID: "91ab9c57-2dae-40c0-84b1-aca2e491c08f") : secret "default-cloud1-coll-meter-proxy-tls" not found Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.626228 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/91ab9c57-2dae-40c0-84b1-aca2e491c08f-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.627720 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/91ab9c57-2dae-40c0-84b1-aca2e491c08f-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.632319 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: 
\"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:21 crc kubenswrapper[4911]: I1201 00:38:21.645184 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzpls\" (UniqueName: \"kubernetes.io/projected/91ab9c57-2dae-40c0-84b1-aca2e491c08f-kube-api-access-bzpls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:22 crc kubenswrapper[4911]: I1201 00:38:22.132031 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:22 crc kubenswrapper[4911]: E1201 00:38:22.132235 4911 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Dec 01 00:38:22 crc kubenswrapper[4911]: E1201 00:38:22.132324 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls podName:91ab9c57-2dae-40c0-84b1-aca2e491c08f nodeName:}" failed. No retries permitted until 2025-12-01 00:38:23.132305043 +0000 UTC m=+1863.271001814 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" (UID: "91ab9c57-2dae-40c0-84b1-aca2e491c08f") : secret "default-cloud1-coll-meter-proxy-tls" not found Dec 01 00:38:22 crc kubenswrapper[4911]: I1201 00:38:22.378903 4911 generic.go:334] "Generic (PLEG): container finished" podID="9a5d7340-9706-4c29-95e4-eb116c175acc" containerID="94bfecc6b61f73b77b4f69da0aa8e8ae2012c663ef1bee4623874b1b5b5e0307" exitCode=0 Dec 01 00:38:22 crc kubenswrapper[4911]: I1201 00:38:22.378946 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"9a5d7340-9706-4c29-95e4-eb116c175acc","Type":"ContainerDied","Data":"94bfecc6b61f73b77b4f69da0aa8e8ae2012c663ef1bee4623874b1b5b5e0307"} Dec 01 00:38:23 crc kubenswrapper[4911]: I1201 00:38:23.145795 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:23 crc kubenswrapper[4911]: I1201 00:38:23.167106 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/91ab9c57-2dae-40c0-84b1-aca2e491c08f-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p\" (UID: \"91ab9c57-2dae-40c0-84b1-aca2e491c08f\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:23 crc kubenswrapper[4911]: I1201 00:38:23.287723 4911 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.459109 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526"] Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.463107 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.465088 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.465321 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.473856 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526"] Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.563821 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d3409052-ad72-4833-a4f0-a0988b4f4814-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.563862 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.563893 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jp9\" (UniqueName: \"kubernetes.io/projected/d3409052-ad72-4833-a4f0-a0988b4f4814-kube-api-access-k8jp9\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.563976 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.564025 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3409052-ad72-4833-a4f0-a0988b4f4814-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.665636 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3409052-ad72-4833-a4f0-a0988b4f4814-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc 
kubenswrapper[4911]: I1201 00:38:24.665738 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d3409052-ad72-4833-a4f0-a0988b4f4814-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.665763 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.665787 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jp9\" (UniqueName: \"kubernetes.io/projected/d3409052-ad72-4833-a4f0-a0988b4f4814-kube-api-access-k8jp9\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.665885 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: E1201 00:38:24.666018 4911 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret 
"default-cloud1-ceil-meter-proxy-tls" not found Dec 01 00:38:24 crc kubenswrapper[4911]: E1201 00:38:24.666075 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls podName:d3409052-ad72-4833-a4f0-a0988b4f4814 nodeName:}" failed. No retries permitted until 2025-12-01 00:38:25.166058756 +0000 UTC m=+1865.304755527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" (UID: "d3409052-ad72-4833-a4f0-a0988b4f4814") : secret "default-cloud1-ceil-meter-proxy-tls" not found Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.666241 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3409052-ad72-4833-a4f0-a0988b4f4814-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.666644 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d3409052-ad72-4833-a4f0-a0988b4f4814-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.678399 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: 
\"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:24 crc kubenswrapper[4911]: I1201 00:38:24.686195 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jp9\" (UniqueName: \"kubernetes.io/projected/d3409052-ad72-4833-a4f0-a0988b4f4814-kube-api-access-k8jp9\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:25 crc kubenswrapper[4911]: I1201 00:38:25.174031 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:25 crc kubenswrapper[4911]: E1201 00:38:25.174116 4911 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Dec 01 00:38:25 crc kubenswrapper[4911]: E1201 00:38:25.174388 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls podName:d3409052-ad72-4833-a4f0-a0988b4f4814 nodeName:}" failed. No retries permitted until 2025-12-01 00:38:26.174372664 +0000 UTC m=+1866.313069435 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" (UID: "d3409052-ad72-4833-a4f0-a0988b4f4814") : secret "default-cloud1-ceil-meter-proxy-tls" not found Dec 01 00:38:26 crc kubenswrapper[4911]: I1201 00:38:26.001888 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p"] Dec 01 00:38:26 crc kubenswrapper[4911]: I1201 00:38:26.195980 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:26 crc kubenswrapper[4911]: I1201 00:38:26.202492 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3409052-ad72-4833-a4f0-a0988b4f4814-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-5d526\" (UID: \"d3409052-ad72-4833-a4f0-a0988b4f4814\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:26 crc kubenswrapper[4911]: I1201 00:38:26.283276 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" Dec 01 00:38:26 crc kubenswrapper[4911]: W1201 00:38:26.311444 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91ab9c57_2dae_40c0_84b1_aca2e491c08f.slice/crio-089e1d5ac04580badcddad644c9670cdda45c83132a892442719fcce47bda8e6 WatchSource:0}: Error finding container 089e1d5ac04580badcddad644c9670cdda45c83132a892442719fcce47bda8e6: Status 404 returned error can't find the container with id 089e1d5ac04580badcddad644c9670cdda45c83132a892442719fcce47bda8e6 Dec 01 00:38:26 crc kubenswrapper[4911]: I1201 00:38:26.402969 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" event={"ID":"91ab9c57-2dae-40c0-84b1-aca2e491c08f","Type":"ContainerStarted","Data":"089e1d5ac04580badcddad644c9670cdda45c83132a892442719fcce47bda8e6"} Dec 01 00:38:26 crc kubenswrapper[4911]: I1201 00:38:26.762056 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526"] Dec 01 00:38:26 crc kubenswrapper[4911]: W1201 00:38:26.772615 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3409052_ad72_4833_a4f0_a0988b4f4814.slice/crio-9e227fd2e0d223a2d7b94c6e3e32ed739c67a8ac43f288dfc5e323bdbd396288 WatchSource:0}: Error finding container 9e227fd2e0d223a2d7b94c6e3e32ed739c67a8ac43f288dfc5e323bdbd396288: Status 404 returned error can't find the container with id 9e227fd2e0d223a2d7b94c6e3e32ed739c67a8ac43f288dfc5e323bdbd396288 Dec 01 00:38:27 crc kubenswrapper[4911]: I1201 00:38:27.421367 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"3a840446-75d9-4c19-816b-1e0424759234","Type":"ContainerStarted","Data":"7184c5bf7fc318a0f0804c2b2f19d4e004acdd4306d9057d06058f3949a5d6df"} Dec 01 00:38:27 crc kubenswrapper[4911]: I1201 00:38:27.429258 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"9a5d7340-9706-4c29-95e4-eb116c175acc","Type":"ContainerStarted","Data":"da8b4f742cff736e8b955e997311971e44fa064c6dd2c5c51d752a482fefa6b6"} Dec 01 00:38:27 crc kubenswrapper[4911]: I1201 00:38:27.431229 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" event={"ID":"91ab9c57-2dae-40c0-84b1-aca2e491c08f","Type":"ContainerStarted","Data":"0beb65b03cbd8a34151c18a29201f9e035349a92bb03b7d324525c5da1d38390"} Dec 01 00:38:27 crc kubenswrapper[4911]: I1201 00:38:27.432747 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" event={"ID":"d3409052-ad72-4833-a4f0-a0988b4f4814","Type":"ContainerStarted","Data":"9e227fd2e0d223a2d7b94c6e3e32ed739c67a8ac43f288dfc5e323bdbd396288"} Dec 01 00:38:27 crc kubenswrapper[4911]: I1201 00:38:27.453192 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.1619381109999996 podStartE2EDuration="35.453172651s" podCreationTimestamp="2025-12-01 00:37:52 +0000 UTC" firstStartedPulling="2025-12-01 00:37:56.126523264 +0000 UTC m=+1836.265220075" lastFinishedPulling="2025-12-01 00:38:26.417757844 +0000 UTC m=+1866.556454615" observedRunningTime="2025-12-01 00:38:27.447501791 +0000 UTC m=+1867.586198572" watchObservedRunningTime="2025-12-01 00:38:27.453172651 +0000 UTC m=+1867.591869422" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.151894 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:38:28 
crc kubenswrapper[4911]: E1201 00:38:28.152134 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.445951 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" event={"ID":"d3409052-ad72-4833-a4f0-a0988b4f4814","Type":"ContainerStarted","Data":"f8175fa03961cfe4ba983854d9b9d36f358a2a9c1b9b3e5397dfad11ae3db5f6"} Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.638238 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p"] Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.639472 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.644342 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.644586 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.646757 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p"] Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.739212 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d6b0cfd-fa42-480c-a9a0-d3425952b519-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.739542 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6b0cfd-fa42-480c-a9a0-d3425952b519-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.739690 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" 
(UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.739748 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.739799 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8vm6\" (UniqueName: \"kubernetes.io/projected/1d6b0cfd-fa42-480c-a9a0-d3425952b519-kube-api-access-w8vm6\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.842234 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6b0cfd-fa42-480c-a9a0-d3425952b519-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.842312 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.842340 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.842367 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8vm6\" (UniqueName: \"kubernetes.io/projected/1d6b0cfd-fa42-480c-a9a0-d3425952b519-kube-api-access-w8vm6\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.842397 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d6b0cfd-fa42-480c-a9a0-d3425952b519-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.842776 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6b0cfd-fa42-480c-a9a0-d3425952b519-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: E1201 00:38:28.842965 4911 secret.go:188] Couldn't get secret 
service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Dec 01 00:38:28 crc kubenswrapper[4911]: E1201 00:38:28.843028 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls podName:1d6b0cfd-fa42-480c-a9a0-d3425952b519 nodeName:}" failed. No retries permitted until 2025-12-01 00:38:29.343011828 +0000 UTC m=+1869.481708589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" (UID: "1d6b0cfd-fa42-480c-a9a0-d3425952b519") : secret "default-cloud1-sens-meter-proxy-tls" not found Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.843568 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d6b0cfd-fa42-480c-a9a0-d3425952b519-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.848868 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:28 crc kubenswrapper[4911]: I1201 00:38:28.861062 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8vm6\" (UniqueName: 
\"kubernetes.io/projected/1d6b0cfd-fa42-480c-a9a0-d3425952b519-kube-api-access-w8vm6\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:29 crc kubenswrapper[4911]: I1201 00:38:29.349871 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" Dec 01 00:38:29 crc kubenswrapper[4911]: E1201 00:38:29.350096 4911 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Dec 01 00:38:29 crc kubenswrapper[4911]: E1201 00:38:29.350184 4911 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls podName:1d6b0cfd-fa42-480c-a9a0-d3425952b519 nodeName:}" failed. No retries permitted until 2025-12-01 00:38:30.350165274 +0000 UTC m=+1870.488862045 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" (UID: "1d6b0cfd-fa42-480c-a9a0-d3425952b519") : secret "default-cloud1-sens-meter-proxy-tls" not found
Dec 01 00:38:29 crc kubenswrapper[4911]: I1201 00:38:29.464936 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"9a5d7340-9706-4c29-95e4-eb116c175acc","Type":"ContainerStarted","Data":"1f7ab47858f2979813e463cd6d54cce786ce6bb8883babe8b238d81eb33c0951"}
Dec 01 00:38:30 crc kubenswrapper[4911]: I1201 00:38:30.366228 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p"
Dec 01 00:38:30 crc kubenswrapper[4911]: I1201 00:38:30.371619 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6b0cfd-fa42-480c-a9a0-d3425952b519-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p\" (UID: \"1d6b0cfd-fa42-480c-a9a0-d3425952b519\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p"
Dec 01 00:38:30 crc kubenswrapper[4911]: I1201 00:38:30.470900 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p"
Dec 01 00:38:30 crc kubenswrapper[4911]: I1201 00:38:30.593823 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0"
Dec 01 00:38:34 crc kubenswrapper[4911]: I1201 00:38:34.204770 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p"]
Dec 01 00:38:34 crc kubenswrapper[4911]: W1201 00:38:34.344564 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d6b0cfd_fa42_480c_a9a0_d3425952b519.slice/crio-31a95dca5ca8ecf274a14e881cf5ba6de0544cbcbaa2b44592afe4b7a0f5be0c WatchSource:0}: Error finding container 31a95dca5ca8ecf274a14e881cf5ba6de0544cbcbaa2b44592afe4b7a0f5be0c: Status 404 returned error can't find the container with id 31a95dca5ca8ecf274a14e881cf5ba6de0544cbcbaa2b44592afe4b7a0f5be0c
Dec 01 00:38:34 crc kubenswrapper[4911]: I1201 00:38:34.512981 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" event={"ID":"91ab9c57-2dae-40c0-84b1-aca2e491c08f","Type":"ContainerStarted","Data":"5b2f215c429c5db1e846c2d724a3b9e3c91d0628bfe4f36a652a31811c47476f"}
Dec 01 00:38:34 crc kubenswrapper[4911]: I1201 00:38:34.515133 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" event={"ID":"d3409052-ad72-4833-a4f0-a0988b4f4814","Type":"ContainerStarted","Data":"d2bc8c6f7061437d5f596ca9c7ede4107b51e31c4f20b6ced73955999d608398"}
Dec 01 00:38:34 crc kubenswrapper[4911]: I1201 00:38:34.517346 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" event={"ID":"1d6b0cfd-fa42-480c-a9a0-d3425952b519","Type":"ContainerStarted","Data":"31a95dca5ca8ecf274a14e881cf5ba6de0544cbcbaa2b44592afe4b7a0f5be0c"}
Dec 01 00:38:34 crc kubenswrapper[4911]: I1201 00:38:34.523699 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"9a5d7340-9706-4c29-95e4-eb116c175acc","Type":"ContainerStarted","Data":"3eda834d26cbb66c2fc72d916405511bc8696fd66c4cf67f04a435a9b8bcd782"}
Dec 01 00:38:34 crc kubenswrapper[4911]: I1201 00:38:34.548389 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=16.921081838 podStartE2EDuration="28.548367335s" podCreationTimestamp="2025-12-01 00:38:06 +0000 UTC" firstStartedPulling="2025-12-01 00:38:22.380808418 +0000 UTC m=+1862.519505189" lastFinishedPulling="2025-12-01 00:38:34.008093915 +0000 UTC m=+1874.146790686" observedRunningTime="2025-12-01 00:38:34.542559161 +0000 UTC m=+1874.681255922" watchObservedRunningTime="2025-12-01 00:38:34.548367335 +0000 UTC m=+1874.687064106"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.653190 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"]
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.654600 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.658786 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.658847 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.656287 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"]
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.663484 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" event={"ID":"1d6b0cfd-fa42-480c-a9a0-d3425952b519","Type":"ContainerStarted","Data":"47a8c064f3a5fe9bb45c26980039ede304f9f46af3fce782a9a9f7f90f4b5a58"}
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.663520 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" event={"ID":"1d6b0cfd-fa42-480c-a9a0-d3425952b519","Type":"ContainerStarted","Data":"443e119ffe7e720cb0a8ab844cef06e008afad23b68778145940c35945ec5aa9"}
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.738771 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6c851284-97df-4a74-b9b3-991939ce2ebb-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.738983 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r89v5\" (UniqueName: \"kubernetes.io/projected/6c851284-97df-4a74-b9b3-991939ce2ebb-kube-api-access-r89v5\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.739212 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/6c851284-97df-4a74-b9b3-991939ce2ebb-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.739249 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c851284-97df-4a74-b9b3-991939ce2ebb-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.840231 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6c851284-97df-4a74-b9b3-991939ce2ebb-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.840312 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r89v5\" (UniqueName: \"kubernetes.io/projected/6c851284-97df-4a74-b9b3-991939ce2ebb-kube-api-access-r89v5\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.840397 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/6c851284-97df-4a74-b9b3-991939ce2ebb-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.840432 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c851284-97df-4a74-b9b3-991939ce2ebb-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.840907 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c851284-97df-4a74-b9b3-991939ce2ebb-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.843390 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6c851284-97df-4a74-b9b3-991939ce2ebb-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.849213 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/6c851284-97df-4a74-b9b3-991939ce2ebb-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.859659 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r89v5\" (UniqueName: \"kubernetes.io/projected/6c851284-97df-4a74-b9b3-991939ce2ebb-kube-api-access-r89v5\") pod \"default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d\" (UID: \"6c851284-97df-4a74-b9b3-991939ce2ebb\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:38 crc kubenswrapper[4911]: I1201 00:38:38.985578 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.356533 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"]
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.358175 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.362619 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.364451 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"]
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.512159 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.512283 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.512351 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vnx\" (UniqueName: \"kubernetes.io/projected/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-kube-api-access-s9vnx\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.512434 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.594364 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.613882 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.614195 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.614324 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vnx\" (UniqueName: \"kubernetes.io/projected/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-kube-api-access-s9vnx\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.614438 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.615054 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.617074 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.621049 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.645646 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vnx\" (UniqueName: \"kubernetes.io/projected/e6639df0-b3ed-44a8-84b7-8d7b5fd66df6-kube-api-access-s9vnx\") pod \"default-cloud1-ceil-event-smartgateway-695c8d666-n897f\" (UID: \"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.646570 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.686447 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"
Dec 01 00:38:40 crc kubenswrapper[4911]: I1201 00:38:40.755992 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0"
Dec 01 00:38:43 crc kubenswrapper[4911]: I1201 00:38:43.151794 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"
Dec 01 00:38:43 crc kubenswrapper[4911]: E1201 00:38:43.152429 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a"
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.317357 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f"]
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.371096 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d"]
Dec 01 00:38:48 crc kubenswrapper[4911]: W1201 00:38:48.381714 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c851284_97df_4a74_b9b3_991939ce2ebb.slice/crio-d8ad55844ab1d53980dcf822cca6ae1309a14612c9a58808e35821629a3802cd WatchSource:0}: Error finding container d8ad55844ab1d53980dcf822cca6ae1309a14612c9a58808e35821629a3802cd: Status 404 returned error can't find the container with id d8ad55844ab1d53980dcf822cca6ae1309a14612c9a58808e35821629a3802cd
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.783020 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" event={"ID":"1d6b0cfd-fa42-480c-a9a0-d3425952b519","Type":"ContainerStarted","Data":"636dc8455f84201f43302eeaa5ab672106cd35f2202075f9dd58fce33515bb70"}
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.784984 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" event={"ID":"91ab9c57-2dae-40c0-84b1-aca2e491c08f","Type":"ContainerStarted","Data":"b07c8d97f139af59c8aaf5f70ab1d3a179ad34f1cd50cc1751d4bc682d3d29f9"}
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.786932 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" event={"ID":"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6","Type":"ContainerStarted","Data":"2bfdd880be6d4e6e0fe9f6835ce548f0d1fa3043934cd773ff403407a42b1be3"}
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.786956 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" event={"ID":"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6","Type":"ContainerStarted","Data":"73e10cdaf402701ecf85e3289244a11896ef80ac6094c4003657bdd31f1d8b78"}
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.788552 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" event={"ID":"d3409052-ad72-4833-a4f0-a0988b4f4814","Type":"ContainerStarted","Data":"88fde190cfb99155d707cc97505fde9a9c9eb7fa2df4f386e3e173da39bad19d"}
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.790141 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" event={"ID":"6c851284-97df-4a74-b9b3-991939ce2ebb","Type":"ContainerStarted","Data":"48f98514846e04412ab43ceb1907233d386546a2209c6cfc55dec77f89e1b0cb"}
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.790291 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" event={"ID":"6c851284-97df-4a74-b9b3-991939ce2ebb","Type":"ContainerStarted","Data":"d8ad55844ab1d53980dcf822cca6ae1309a14612c9a58808e35821629a3802cd"}
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.806936 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" podStartSLOduration=6.962757916 podStartE2EDuration="20.806914813s" podCreationTimestamp="2025-12-01 00:38:28 +0000 UTC" firstStartedPulling="2025-12-01 00:38:34.347104031 +0000 UTC m=+1874.485800792" lastFinishedPulling="2025-12-01 00:38:48.191260908 +0000 UTC m=+1888.329957689" observedRunningTime="2025-12-01 00:38:48.80182509 +0000 UTC m=+1888.940521861" watchObservedRunningTime="2025-12-01 00:38:48.806914813 +0000 UTC m=+1888.945611584"
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.832610 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" podStartSLOduration=6.127791809 podStartE2EDuration="27.832590597s" podCreationTimestamp="2025-12-01 00:38:21 +0000 UTC" firstStartedPulling="2025-12-01 00:38:26.422447066 +0000 UTC m=+1866.561143857" lastFinishedPulling="2025-12-01 00:38:48.127245854 +0000 UTC m=+1888.265942645" observedRunningTime="2025-12-01 00:38:48.831941459 +0000 UTC m=+1888.970638270" watchObservedRunningTime="2025-12-01 00:38:48.832590597 +0000 UTC m=+1888.971287368"
Dec 01 00:38:48 crc kubenswrapper[4911]: I1201 00:38:48.860164 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" podStartSLOduration=3.496618065 podStartE2EDuration="24.860141253s" podCreationTimestamp="2025-12-01 00:38:24 +0000 UTC" firstStartedPulling="2025-12-01 00:38:26.775814517 +0000 UTC m=+1866.914511288" lastFinishedPulling="2025-12-01 00:38:48.139337695 +0000 UTC m=+1888.278034476" observedRunningTime="2025-12-01 00:38:48.855642087 +0000 UTC m=+1888.994338888" watchObservedRunningTime="2025-12-01 00:38:48.860141253 +0000 UTC m=+1888.998838024"
Dec 01 00:38:49 crc kubenswrapper[4911]: I1201 00:38:49.798848 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" event={"ID":"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6","Type":"ContainerStarted","Data":"72834a714b64119c3ad58f7f98ab8c6175a9d03198b0e40798ab2d25b4d14c14"}
Dec 01 00:38:49 crc kubenswrapper[4911]: I1201 00:38:49.800938 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" event={"ID":"6c851284-97df-4a74-b9b3-991939ce2ebb","Type":"ContainerStarted","Data":"6bbcebbfb4c3819aeb499f94c26a4340fd20a7f9833eba3ec85e6384d187b38c"}
Dec 01 00:38:49 crc kubenswrapper[4911]: I1201 00:38:49.816498 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" podStartSLOduration=9.482931698 podStartE2EDuration="9.8164779s" podCreationTimestamp="2025-12-01 00:38:40 +0000 UTC" firstStartedPulling="2025-12-01 00:38:48.350353193 +0000 UTC m=+1888.489049954" lastFinishedPulling="2025-12-01 00:38:48.683899385 +0000 UTC m=+1888.822596156" observedRunningTime="2025-12-01 00:38:49.814822984 +0000 UTC m=+1889.953519785" watchObservedRunningTime="2025-12-01 00:38:49.8164779 +0000 UTC m=+1889.955174691"
Dec 01 00:38:49 crc kubenswrapper[4911]: I1201 00:38:49.847204 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" podStartSLOduration=12.518702937 podStartE2EDuration="12.847151445s" podCreationTimestamp="2025-12-01 00:38:37 +0000 UTC" firstStartedPulling="2025-12-01 00:38:48.385522455 +0000 UTC m=+1888.524219226" lastFinishedPulling="2025-12-01 00:38:48.713970963 +0000 UTC m=+1888.852667734" observedRunningTime="2025-12-01 00:38:49.843705928 +0000 UTC m=+1889.982402709" watchObservedRunningTime="2025-12-01 00:38:49.847151445 +0000 UTC m=+1889.985848226"
Dec 01 00:38:52 crc kubenswrapper[4911]: I1201 00:38:52.938620 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4tvsb"]
Dec 01 00:38:52 crc kubenswrapper[4911]: I1201 00:38:52.939411 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" podUID="2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" containerName="default-interconnect" containerID="cri-o://a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7" gracePeriod=30
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.342576 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb"
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.517537 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-credentials\") pod \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") "
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.517649 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-users\") pod \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") "
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.517756 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-ca\") pod \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") "
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.517807 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-config\") pod \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") "
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.517829 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4pxl\" (UniqueName: \"kubernetes.io/projected/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-kube-api-access-h4pxl\") pod \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") "
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.517863 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-ca\") pod \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") "
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.517901 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-credentials\") pod \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\" (UID: \"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8\") "
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.518525 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" (UID: "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.523948 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-kube-api-access-h4pxl" (OuterVolumeSpecName: "kube-api-access-h4pxl") pod "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" (UID: "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8"). InnerVolumeSpecName "kube-api-access-h4pxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.524321 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" (UID: "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.526579 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" (UID: "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.527648 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" (UID: "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.527728 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" (UID: "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.527717 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" (UID: "2b3f0422-775e-4d08-a19c-9c9d0a5cfae8"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.619615 4911 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.620006 4911 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-config\") on node \"crc\" DevicePath \"\""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.620021 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4pxl\" (UniqueName: \"kubernetes.io/projected/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-kube-api-access-h4pxl\") on node \"crc\" DevicePath \"\""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.620035 4911 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.620047 4911 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.620061 4911 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.620073 4911 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8-sasl-users\") on node \"crc\" DevicePath \"\""
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.834754 4911 generic.go:334] "Generic (PLEG): container finished" podID="91ab9c57-2dae-40c0-84b1-aca2e491c08f" containerID="5b2f215c429c5db1e846c2d724a3b9e3c91d0628bfe4f36a652a31811c47476f" exitCode=0
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.834807 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" event={"ID":"91ab9c57-2dae-40c0-84b1-aca2e491c08f","Type":"ContainerDied","Data":"5b2f215c429c5db1e846c2d724a3b9e3c91d0628bfe4f36a652a31811c47476f"}
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.835595 4911 scope.go:117] "RemoveContainer" containerID="5b2f215c429c5db1e846c2d724a3b9e3c91d0628bfe4f36a652a31811c47476f"
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.837555 4911 generic.go:334] "Generic (PLEG): container finished" podID="e6639df0-b3ed-44a8-84b7-8d7b5fd66df6" containerID="2bfdd880be6d4e6e0fe9f6835ce548f0d1fa3043934cd773ff403407a42b1be3" exitCode=0
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.837597 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" event={"ID":"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6","Type":"ContainerDied","Data":"2bfdd880be6d4e6e0fe9f6835ce548f0d1fa3043934cd773ff403407a42b1be3"}
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.838318 4911 scope.go:117] "RemoveContainer" containerID="2bfdd880be6d4e6e0fe9f6835ce548f0d1fa3043934cd773ff403407a42b1be3"
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.840431 4911 generic.go:334] "Generic (PLEG): container finished" podID="2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" containerID="a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7" exitCode=0
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.840499 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" event={"ID":"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8","Type":"ContainerDied","Data":"a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7"}
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.840508 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb"
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.840567 4911 scope.go:117] "RemoveContainer" containerID="a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7"
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.840549 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-4tvsb" event={"ID":"2b3f0422-775e-4d08-a19c-9c9d0a5cfae8","Type":"ContainerDied","Data":"1188909e38388071bb2da1c05ea0816cad5949d7bfff08a54ddf759bf6186682"}
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.848855 4911 generic.go:334] "Generic (PLEG): container finished" podID="d3409052-ad72-4833-a4f0-a0988b4f4814" containerID="d2bc8c6f7061437d5f596ca9c7ede4107b51e31c4f20b6ced73955999d608398" exitCode=0
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.848957 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" event={"ID":"d3409052-ad72-4833-a4f0-a0988b4f4814","Type":"ContainerDied","Data":"d2bc8c6f7061437d5f596ca9c7ede4107b51e31c4f20b6ced73955999d608398"}
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.849760 4911 scope.go:117] "RemoveContainer" containerID="d2bc8c6f7061437d5f596ca9c7ede4107b51e31c4f20b6ced73955999d608398"
Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.875539 4911 scope.go:117] "RemoveContainer" containerID="a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7"
Dec 01 00:38:53 crc kubenswrapper[4911]: E1201 00:38:53.876071 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7\": container with ID starting with a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7 not found: ID does not exist"
containerID="a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7" Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.876114 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7"} err="failed to get container status \"a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7\": rpc error: code = NotFound desc = could not find container \"a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7\": container with ID starting with a479689a406541cc1ab5f86e14a851fc8b14e2cb98d49bd5812387b0beb263d7 not found: ID does not exist" Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.891288 4911 generic.go:334] "Generic (PLEG): container finished" podID="6c851284-97df-4a74-b9b3-991939ce2ebb" containerID="48f98514846e04412ab43ceb1907233d386546a2209c6cfc55dec77f89e1b0cb" exitCode=0 Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.891366 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" event={"ID":"6c851284-97df-4a74-b9b3-991939ce2ebb","Type":"ContainerDied","Data":"48f98514846e04412ab43ceb1907233d386546a2209c6cfc55dec77f89e1b0cb"} Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.891945 4911 scope.go:117] "RemoveContainer" containerID="48f98514846e04412ab43ceb1907233d386546a2209c6cfc55dec77f89e1b0cb" Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.935341 4911 generic.go:334] "Generic (PLEG): container finished" podID="1d6b0cfd-fa42-480c-a9a0-d3425952b519" containerID="47a8c064f3a5fe9bb45c26980039ede304f9f46af3fce782a9a9f7f90f4b5a58" exitCode=0 Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.935396 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" 
event={"ID":"1d6b0cfd-fa42-480c-a9a0-d3425952b519","Type":"ContainerDied","Data":"47a8c064f3a5fe9bb45c26980039ede304f9f46af3fce782a9a9f7f90f4b5a58"} Dec 01 00:38:53 crc kubenswrapper[4911]: I1201 00:38:53.936084 4911 scope.go:117] "RemoveContainer" containerID="47a8c064f3a5fe9bb45c26980039ede304f9f46af3fce782a9a9f7f90f4b5a58" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.017862 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4tvsb"] Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.046436 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4tvsb"] Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.179695 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" path="/var/lib/kubelet/pods/2b3f0422-775e-4d08-a19c-9c9d0a5cfae8/volumes" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.719708 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kk5bf"] Dec 01 00:38:54 crc kubenswrapper[4911]: E1201 00:38:54.719999 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" containerName="default-interconnect" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.720012 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" containerName="default-interconnect" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.720143 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3f0422-775e-4d08-a19c-9c9d0a5cfae8" containerName="default-interconnect" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.720660 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.726076 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.726125 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.726167 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-z7k2d" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.726324 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.726513 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.726564 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.726634 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.739443 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kk5bf"] Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.843584 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/309c2661-dd65-4895-99f6-8e35d708d3ec-sasl-config\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " 
pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.843649 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.843696 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.843755 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-sasl-users\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.843807 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4ts\" (UniqueName: \"kubernetes.io/projected/309c2661-dd65-4895-99f6-8e35d708d3ec-kube-api-access-wz4ts\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.843842 4911 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.843900 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.944879 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.945247 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/309c2661-dd65-4895-99f6-8e35d708d3ec-sasl-config\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.945061 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" 
event={"ID":"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6","Type":"ContainerStarted","Data":"f497e827d9828f194fb1e0047e22328ac20f7afb30083d059d37be8c7556140e"} Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.945273 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.945291 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.945331 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-sasl-users\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.945366 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4ts\" (UniqueName: \"kubernetes.io/projected/309c2661-dd65-4895-99f6-8e35d708d3ec-kube-api-access-wz4ts\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.945385 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.947001 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/309c2661-dd65-4895-99f6-8e35d708d3ec-sasl-config\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.951805 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.952135 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" event={"ID":"d3409052-ad72-4833-a4f0-a0988b4f4814","Type":"ContainerStarted","Data":"e3337a0ff16a470331378639b658f52535a39780a559495698ad655b60171d16"} Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.952997 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: 
\"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.953290 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.954208 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-sasl-users\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.956901 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/309c2661-dd65-4895-99f6-8e35d708d3ec-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.957928 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" event={"ID":"6c851284-97df-4a74-b9b3-991939ce2ebb","Type":"ContainerStarted","Data":"9dd0f3737027325615c545143694f70af275ea4d14a6d5a58c998e30dc37eb69"} Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.965123 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4ts\" (UniqueName: 
\"kubernetes.io/projected/309c2661-dd65-4895-99f6-8e35d708d3ec-kube-api-access-wz4ts\") pod \"default-interconnect-68864d46cb-kk5bf\" (UID: \"309c2661-dd65-4895-99f6-8e35d708d3ec\") " pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.966706 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" event={"ID":"1d6b0cfd-fa42-480c-a9a0-d3425952b519","Type":"ContainerStarted","Data":"714d2e4266f244aa14b0d3bb53cdd6fef7c32af8efbbf3f02760ec90cbced29c"} Dec 01 00:38:54 crc kubenswrapper[4911]: I1201 00:38:54.970067 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" event={"ID":"91ab9c57-2dae-40c0-84b1-aca2e491c08f","Type":"ContainerStarted","Data":"b6390d9d4fc32b5363965731716cb576fe49bc57069d648bc1ceb63e8d9f2ea2"} Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.035609 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.471828 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kk5bf"] Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.980680 4911 generic.go:334] "Generic (PLEG): container finished" podID="91ab9c57-2dae-40c0-84b1-aca2e491c08f" containerID="b6390d9d4fc32b5363965731716cb576fe49bc57069d648bc1ceb63e8d9f2ea2" exitCode=0 Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.980756 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" event={"ID":"91ab9c57-2dae-40c0-84b1-aca2e491c08f","Type":"ContainerDied","Data":"b6390d9d4fc32b5363965731716cb576fe49bc57069d648bc1ceb63e8d9f2ea2"} Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.980823 4911 scope.go:117] "RemoveContainer" containerID="5b2f215c429c5db1e846c2d724a3b9e3c91d0628bfe4f36a652a31811c47476f" Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.981396 4911 scope.go:117] "RemoveContainer" containerID="b6390d9d4fc32b5363965731716cb576fe49bc57069d648bc1ceb63e8d9f2ea2" Dec 01 00:38:55 crc kubenswrapper[4911]: E1201 00:38:55.981772 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p_service-telemetry(91ab9c57-2dae-40c0-84b1-aca2e491c08f)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" podUID="91ab9c57-2dae-40c0-84b1-aca2e491c08f" Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.984407 4911 generic.go:334] "Generic (PLEG): container finished" podID="e6639df0-b3ed-44a8-84b7-8d7b5fd66df6" containerID="f497e827d9828f194fb1e0047e22328ac20f7afb30083d059d37be8c7556140e" exitCode=0 Dec 01 00:38:55 crc 
kubenswrapper[4911]: I1201 00:38:55.984507 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" event={"ID":"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6","Type":"ContainerDied","Data":"f497e827d9828f194fb1e0047e22328ac20f7afb30083d059d37be8c7556140e"} Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.985034 4911 scope.go:117] "RemoveContainer" containerID="f497e827d9828f194fb1e0047e22328ac20f7afb30083d059d37be8c7556140e" Dec 01 00:38:55 crc kubenswrapper[4911]: E1201 00:38:55.985239 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-695c8d666-n897f_service-telemetry(e6639df0-b3ed-44a8-84b7-8d7b5fd66df6)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" podUID="e6639df0-b3ed-44a8-84b7-8d7b5fd66df6" Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.988357 4911 generic.go:334] "Generic (PLEG): container finished" podID="d3409052-ad72-4833-a4f0-a0988b4f4814" containerID="e3337a0ff16a470331378639b658f52535a39780a559495698ad655b60171d16" exitCode=0 Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.988409 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" event={"ID":"d3409052-ad72-4833-a4f0-a0988b4f4814","Type":"ContainerDied","Data":"e3337a0ff16a470331378639b658f52535a39780a559495698ad655b60171d16"} Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.988916 4911 scope.go:117] "RemoveContainer" containerID="e3337a0ff16a470331378639b658f52535a39780a559495698ad655b60171d16" Dec 01 00:38:55 crc kubenswrapper[4911]: E1201 00:38:55.989092 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-5d526_service-telemetry(d3409052-ad72-4833-a4f0-a0988b4f4814)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" podUID="d3409052-ad72-4833-a4f0-a0988b4f4814" Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.991124 4911 generic.go:334] "Generic (PLEG): container finished" podID="6c851284-97df-4a74-b9b3-991939ce2ebb" containerID="9dd0f3737027325615c545143694f70af275ea4d14a6d5a58c998e30dc37eb69" exitCode=0 Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.991167 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" event={"ID":"6c851284-97df-4a74-b9b3-991939ce2ebb","Type":"ContainerDied","Data":"9dd0f3737027325615c545143694f70af275ea4d14a6d5a58c998e30dc37eb69"} Dec 01 00:38:55 crc kubenswrapper[4911]: I1201 00:38:55.991498 4911 scope.go:117] "RemoveContainer" containerID="9dd0f3737027325615c545143694f70af275ea4d14a6d5a58c998e30dc37eb69" Dec 01 00:38:55 crc kubenswrapper[4911]: E1201 00:38:55.991673 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d_service-telemetry(6c851284-97df-4a74-b9b3-991939ce2ebb)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" podUID="6c851284-97df-4a74-b9b3-991939ce2ebb" Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.008431 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" event={"ID":"309c2661-dd65-4895-99f6-8e35d708d3ec","Type":"ContainerStarted","Data":"975a8d1baf29010c439014c94b2affed90e05e35dc1f9f8880c3eb4eadd737e8"} Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.008510 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" event={"ID":"309c2661-dd65-4895-99f6-8e35d708d3ec","Type":"ContainerStarted","Data":"64a07823c7f37d9c914a418878968724396f3661481ec2814aa3531510f7cde0"} Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.017202 4911 generic.go:334] "Generic (PLEG): container finished" podID="1d6b0cfd-fa42-480c-a9a0-d3425952b519" containerID="714d2e4266f244aa14b0d3bb53cdd6fef7c32af8efbbf3f02760ec90cbced29c" exitCode=0 Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.017260 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" event={"ID":"1d6b0cfd-fa42-480c-a9a0-d3425952b519","Type":"ContainerDied","Data":"714d2e4266f244aa14b0d3bb53cdd6fef7c32af8efbbf3f02760ec90cbced29c"} Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.017896 4911 scope.go:117] "RemoveContainer" containerID="714d2e4266f244aa14b0d3bb53cdd6fef7c32af8efbbf3f02760ec90cbced29c" Dec 01 00:38:56 crc kubenswrapper[4911]: E1201 00:38:56.018156 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p_service-telemetry(1d6b0cfd-fa42-480c-a9a0-d3425952b519)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" podUID="1d6b0cfd-fa42-480c-a9a0-d3425952b519" Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.058704 4911 scope.go:117] "RemoveContainer" containerID="2bfdd880be6d4e6e0fe9f6835ce548f0d1fa3043934cd773ff403407a42b1be3" Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.119281 4911 scope.go:117] "RemoveContainer" containerID="d2bc8c6f7061437d5f596ca9c7ede4107b51e31c4f20b6ced73955999d608398" Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.155367 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/default-interconnect-68864d46cb-kk5bf" podStartSLOduration=4.1553510750000004 podStartE2EDuration="4.155351075s" podCreationTimestamp="2025-12-01 00:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 00:38:56.148868622 +0000 UTC m=+1896.287565393" watchObservedRunningTime="2025-12-01 00:38:56.155351075 +0000 UTC m=+1896.294047846" Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.182356 4911 scope.go:117] "RemoveContainer" containerID="48f98514846e04412ab43ceb1907233d386546a2209c6cfc55dec77f89e1b0cb" Dec 01 00:38:56 crc kubenswrapper[4911]: I1201 00:38:56.220678 4911 scope.go:117] "RemoveContainer" containerID="47a8c064f3a5fe9bb45c26980039ede304f9f46af3fce782a9a9f7f90f4b5a58" Dec 01 00:38:58 crc kubenswrapper[4911]: I1201 00:38:58.152330 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:38:58 crc kubenswrapper[4911]: E1201 00:38:58.152793 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.051973 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.052981 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.055278 4911 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.060973 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.094515 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.173642 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-qdr-test-config\") pod \"qdr-test\" (UID: \"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.173685 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwm4\" (UniqueName: \"kubernetes.io/projected/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-kube-api-access-8mwm4\") pod \"qdr-test\" (UID: \"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.173729 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.274892 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: 
\"kubernetes.io/configmap/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-qdr-test-config\") pod \"qdr-test\" (UID: \"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.274959 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mwm4\" (UniqueName: \"kubernetes.io/projected/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-kube-api-access-8mwm4\") pod \"qdr-test\" (UID: \"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.275375 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.275820 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-qdr-test-config\") pod \"qdr-test\" (UID: \"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.285178 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.304545 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mwm4\" (UniqueName: \"kubernetes.io/projected/099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e-kube-api-access-8mwm4\") pod \"qdr-test\" (UID: 
\"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e\") " pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.372964 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Dec 01 00:39:02 crc kubenswrapper[4911]: I1201 00:39:02.578949 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Dec 01 00:39:03 crc kubenswrapper[4911]: I1201 00:39:03.104992 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e","Type":"ContainerStarted","Data":"f559b18e14c1479221c215ec88e2bcd13a453e21bf85d1e12093ac165ec5ae78"} Dec 01 00:39:08 crc kubenswrapper[4911]: I1201 00:39:08.154612 4911 scope.go:117] "RemoveContainer" containerID="b6390d9d4fc32b5363965731716cb576fe49bc57069d648bc1ceb63e8d9f2ea2" Dec 01 00:39:08 crc kubenswrapper[4911]: I1201 00:39:08.155230 4911 scope.go:117] "RemoveContainer" containerID="714d2e4266f244aa14b0d3bb53cdd6fef7c32af8efbbf3f02760ec90cbced29c" Dec 01 00:39:08 crc kubenswrapper[4911]: I1201 00:39:08.155906 4911 scope.go:117] "RemoveContainer" containerID="e3337a0ff16a470331378639b658f52535a39780a559495698ad655b60171d16" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.168893 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p" event={"ID":"1d6b0cfd-fa42-480c-a9a0-d3425952b519","Type":"ContainerStarted","Data":"bb55773f66deed14d869a8536c63e841743e604cc65250cee8362681aae5170d"} Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.180211 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e","Type":"ContainerStarted","Data":"5bfb7c7ef88565ad0c276c5b9f5f1ce985099b73e7a2ad8c712814cc42e0826d"} Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.183563 4911 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p" event={"ID":"91ab9c57-2dae-40c0-84b1-aca2e491c08f","Type":"ContainerStarted","Data":"805164578f4275c8ffb4a958a147a9a07f6797764d84299f7b33f73f8f9c0f95"} Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.187596 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-5d526" event={"ID":"d3409052-ad72-4833-a4f0-a0988b4f4814","Type":"ContainerStarted","Data":"f4cd1528a5d6e790cbb155ac3c549112a55a91e951d68af0b582c9197b2e5539"} Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.232707 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=1.100933012 podStartE2EDuration="8.232689966s" podCreationTimestamp="2025-12-01 00:39:02 +0000 UTC" firstStartedPulling="2025-12-01 00:39:02.594721482 +0000 UTC m=+1902.733418253" lastFinishedPulling="2025-12-01 00:39:09.726478436 +0000 UTC m=+1909.865175207" observedRunningTime="2025-12-01 00:39:10.230858104 +0000 UTC m=+1910.369554895" watchObservedRunningTime="2025-12-01 00:39:10.232689966 +0000 UTC m=+1910.371386737" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.576773 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-59drf"] Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.578006 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.580749 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.585276 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.589342 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.589455 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.589565 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.590023 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.594807 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-59drf"] Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.703166 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.703206 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"collectd-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-config\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.703321 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.703365 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-sensubility-config\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.703394 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-healthcheck-log\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.703449 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-publisher\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 
00:39:10.703529 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxg7m\" (UniqueName: \"kubernetes.io/projected/637dc0c7-ab77-424b-a1e0-43882b85445a-kube-api-access-jxg7m\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.805115 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxg7m\" (UniqueName: \"kubernetes.io/projected/637dc0c7-ab77-424b-a1e0-43882b85445a-kube-api-access-jxg7m\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.805195 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.805233 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-config\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.805314 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " 
pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.805356 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-sensubility-config\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.805401 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-healthcheck-log\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.805503 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-publisher\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.806566 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.806708 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-sensubility-config\") pod \"stf-smoketest-smoke1-59drf\" (UID: 
\"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.806841 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-publisher\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.807143 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-healthcheck-log\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.807169 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-config\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.807575 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-59drf\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.842803 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxg7m\" (UniqueName: \"kubernetes.io/projected/637dc0c7-ab77-424b-a1e0-43882b85445a-kube-api-access-jxg7m\") pod \"stf-smoketest-smoke1-59drf\" (UID: 
\"637dc0c7-ab77-424b-a1e0-43882b85445a\") " pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:10 crc kubenswrapper[4911]: I1201 00:39:10.900163 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.003738 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.005011 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.033325 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.110588 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87dmn\" (UniqueName: \"kubernetes.io/projected/90214ed1-19ba-4245-9b92-5c3362d6bc4a-kube-api-access-87dmn\") pod \"curl\" (UID: \"90214ed1-19ba-4245-9b92-5c3362d6bc4a\") " pod="service-telemetry/curl" Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.151912 4911 scope.go:117] "RemoveContainer" containerID="f497e827d9828f194fb1e0047e22328ac20f7afb30083d059d37be8c7556140e" Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.152224 4911 scope.go:117] "RemoveContainer" containerID="9dd0f3737027325615c545143694f70af275ea4d14a6d5a58c998e30dc37eb69" Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.193515 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-59drf"] Dec 01 00:39:11 crc kubenswrapper[4911]: W1201 00:39:11.199020 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod637dc0c7_ab77_424b_a1e0_43882b85445a.slice/crio-c61d271e9ccf40addbb214577d17c961769f7c3b9904eab739173f8d14fa949b WatchSource:0}: Error finding 
container c61d271e9ccf40addbb214577d17c961769f7c3b9904eab739173f8d14fa949b: Status 404 returned error can't find the container with id c61d271e9ccf40addbb214577d17c961769f7c3b9904eab739173f8d14fa949b Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.212270 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87dmn\" (UniqueName: \"kubernetes.io/projected/90214ed1-19ba-4245-9b92-5c3362d6bc4a-kube-api-access-87dmn\") pod \"curl\" (UID: \"90214ed1-19ba-4245-9b92-5c3362d6bc4a\") " pod="service-telemetry/curl" Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.233669 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87dmn\" (UniqueName: \"kubernetes.io/projected/90214ed1-19ba-4245-9b92-5c3362d6bc4a-kube-api-access-87dmn\") pod \"curl\" (UID: \"90214ed1-19ba-4245-9b92-5c3362d6bc4a\") " pod="service-telemetry/curl" Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.359418 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Dec 01 00:39:11 crc kubenswrapper[4911]: I1201 00:39:11.608676 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Dec 01 00:39:12 crc kubenswrapper[4911]: I1201 00:39:12.221891 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-59drf" event={"ID":"637dc0c7-ab77-424b-a1e0-43882b85445a","Type":"ContainerStarted","Data":"c61d271e9ccf40addbb214577d17c961769f7c3b9904eab739173f8d14fa949b"} Dec 01 00:39:12 crc kubenswrapper[4911]: I1201 00:39:12.231841 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695c8d666-n897f" event={"ID":"e6639df0-b3ed-44a8-84b7-8d7b5fd66df6","Type":"ContainerStarted","Data":"eb872f1b627dcbd8b44347597577047853d5942d827c3a6f918e2710f9136286"} Dec 01 00:39:12 crc kubenswrapper[4911]: I1201 00:39:12.234607 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"90214ed1-19ba-4245-9b92-5c3362d6bc4a","Type":"ContainerStarted","Data":"221e26fe2fa21cb7b74b43d1b90b37425bd324045720d27408434be9448744b6"} Dec 01 00:39:12 crc kubenswrapper[4911]: I1201 00:39:12.237868 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d" event={"ID":"6c851284-97df-4a74-b9b3-991939ce2ebb","Type":"ContainerStarted","Data":"6038f36f3f80e2e80b307e114886207b825801f21944d987e580c953e27c353d"} Dec 01 00:39:14 crc kubenswrapper[4911]: I1201 00:39:13.151664 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:39:14 crc kubenswrapper[4911]: E1201 00:39:13.151884 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:39:25 crc kubenswrapper[4911]: I1201 00:39:25.356901 4911 generic.go:334] "Generic (PLEG): container finished" podID="90214ed1-19ba-4245-9b92-5c3362d6bc4a" containerID="5950234f8737bf339783a6ca0c111ebb07ba364499bc596a5aa3a14ff21ee540" exitCode=0 Dec 01 00:39:25 crc kubenswrapper[4911]: I1201 00:39:25.357602 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"90214ed1-19ba-4245-9b92-5c3362d6bc4a","Type":"ContainerDied","Data":"5950234f8737bf339783a6ca0c111ebb07ba364499bc596a5aa3a14ff21ee540"} Dec 01 00:39:25 crc kubenswrapper[4911]: I1201 00:39:25.362376 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-59drf" event={"ID":"637dc0c7-ab77-424b-a1e0-43882b85445a","Type":"ContainerStarted","Data":"1ae19b2c2843533e801c212f9e74b0608287175a363b24df49c14fcbb17b259f"} Dec 01 00:39:28 crc kubenswrapper[4911]: I1201 00:39:28.152099 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097" Dec 01 00:39:33 crc kubenswrapper[4911]: I1201 00:39:33.432780 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"90214ed1-19ba-4245-9b92-5c3362d6bc4a","Type":"ContainerDied","Data":"221e26fe2fa21cb7b74b43d1b90b37425bd324045720d27408434be9448744b6"} Dec 01 00:39:33 crc kubenswrapper[4911]: I1201 00:39:33.433490 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="221e26fe2fa21cb7b74b43d1b90b37425bd324045720d27408434be9448744b6" Dec 01 00:39:33 crc kubenswrapper[4911]: I1201 00:39:33.502241 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Dec 01 00:39:33 crc kubenswrapper[4911]: I1201 00:39:33.594643 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87dmn\" (UniqueName: \"kubernetes.io/projected/90214ed1-19ba-4245-9b92-5c3362d6bc4a-kube-api-access-87dmn\") pod \"90214ed1-19ba-4245-9b92-5c3362d6bc4a\" (UID: \"90214ed1-19ba-4245-9b92-5c3362d6bc4a\") " Dec 01 00:39:33 crc kubenswrapper[4911]: I1201 00:39:33.799989 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90214ed1-19ba-4245-9b92-5c3362d6bc4a-kube-api-access-87dmn" (OuterVolumeSpecName: "kube-api-access-87dmn") pod "90214ed1-19ba-4245-9b92-5c3362d6bc4a" (UID: "90214ed1-19ba-4245-9b92-5c3362d6bc4a"). InnerVolumeSpecName "kube-api-access-87dmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:39:33 crc kubenswrapper[4911]: I1201 00:39:33.823634 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_90214ed1-19ba-4245-9b92-5c3362d6bc4a/curl/0.log" Dec 01 00:39:33 crc kubenswrapper[4911]: I1201 00:39:33.895890 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87dmn\" (UniqueName: \"kubernetes.io/projected/90214ed1-19ba-4245-9b92-5c3362d6bc4a-kube-api-access-87dmn\") on node \"crc\" DevicePath \"\"" Dec 01 00:39:34 crc kubenswrapper[4911]: I1201 00:39:34.089326 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lmsf6_7c4346a2-51ae-4df1-b881-d6eac247e811/prometheus-webhook-snmp/0.log" Dec 01 00:39:34 crc kubenswrapper[4911]: I1201 00:39:34.442947 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Dec 01 00:39:35 crc kubenswrapper[4911]: I1201 00:39:35.451345 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-59drf" event={"ID":"637dc0c7-ab77-424b-a1e0-43882b85445a","Type":"ContainerStarted","Data":"ef1771f81432dd41defe0c825a66619f87af018d93536e5eaa832fee85b935b4"} Dec 01 00:39:35 crc kubenswrapper[4911]: I1201 00:39:35.455518 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"7095373e0714dc0a7569921cf9a51bfd5df924973bf8fa1f16cfab9c4c56c587"} Dec 01 00:39:35 crc kubenswrapper[4911]: I1201 00:39:35.481772 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-59drf" podStartSLOduration=2.182331853 podStartE2EDuration="25.48175033s" podCreationTimestamp="2025-12-01 00:39:10 +0000 UTC" firstStartedPulling="2025-12-01 00:39:11.216738244 +0000 UTC m=+1911.355435015" lastFinishedPulling="2025-12-01 00:39:34.516156721 +0000 UTC m=+1934.654853492" observedRunningTime="2025-12-01 00:39:35.46965863 +0000 UTC m=+1935.608355411" watchObservedRunningTime="2025-12-01 00:39:35.48175033 +0000 UTC m=+1935.620447101" Dec 01 00:39:59 crc kubenswrapper[4911]: I1201 00:39:59.650886 4911 generic.go:334] "Generic (PLEG): container finished" podID="637dc0c7-ab77-424b-a1e0-43882b85445a" containerID="1ae19b2c2843533e801c212f9e74b0608287175a363b24df49c14fcbb17b259f" exitCode=0 Dec 01 00:39:59 crc kubenswrapper[4911]: I1201 00:39:59.650991 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-59drf" event={"ID":"637dc0c7-ab77-424b-a1e0-43882b85445a","Type":"ContainerDied","Data":"1ae19b2c2843533e801c212f9e74b0608287175a363b24df49c14fcbb17b259f"} Dec 01 00:39:59 crc kubenswrapper[4911]: I1201 00:39:59.652245 4911 scope.go:117] 
"RemoveContainer" containerID="1ae19b2c2843533e801c212f9e74b0608287175a363b24df49c14fcbb17b259f" Dec 01 00:40:04 crc kubenswrapper[4911]: I1201 00:40:04.244360 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lmsf6_7c4346a2-51ae-4df1-b881-d6eac247e811/prometheus-webhook-snmp/0.log" Dec 01 00:40:06 crc kubenswrapper[4911]: I1201 00:40:06.719002 4911 generic.go:334] "Generic (PLEG): container finished" podID="637dc0c7-ab77-424b-a1e0-43882b85445a" containerID="ef1771f81432dd41defe0c825a66619f87af018d93536e5eaa832fee85b935b4" exitCode=0 Dec 01 00:40:06 crc kubenswrapper[4911]: I1201 00:40:06.719055 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-59drf" event={"ID":"637dc0c7-ab77-424b-a1e0-43882b85445a","Type":"ContainerDied","Data":"ef1771f81432dd41defe0c825a66619f87af018d93536e5eaa832fee85b935b4"} Dec 01 00:40:07 crc kubenswrapper[4911]: I1201 00:40:07.959159 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-59drf" Dec 01 00:40:07 crc kubenswrapper[4911]: I1201 00:40:07.998945 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-sensubility-config\") pod \"637dc0c7-ab77-424b-a1e0-43882b85445a\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " Dec 01 00:40:07 crc kubenswrapper[4911]: I1201 00:40:07.999002 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-healthcheck-log\") pod \"637dc0c7-ab77-424b-a1e0-43882b85445a\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " Dec 01 00:40:07 crc kubenswrapper[4911]: I1201 00:40:07.999045 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-config\") pod \"637dc0c7-ab77-424b-a1e0-43882b85445a\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " Dec 01 00:40:07 crc kubenswrapper[4911]: I1201 00:40:07.999076 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-publisher\") pod \"637dc0c7-ab77-424b-a1e0-43882b85445a\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " Dec 01 00:40:07 crc kubenswrapper[4911]: I1201 00:40:07.999134 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxg7m\" (UniqueName: \"kubernetes.io/projected/637dc0c7-ab77-424b-a1e0-43882b85445a-kube-api-access-jxg7m\") pod \"637dc0c7-ab77-424b-a1e0-43882b85445a\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " Dec 01 00:40:07 crc kubenswrapper[4911]: I1201 00:40:07.999197 4911 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-entrypoint-script\") pod \"637dc0c7-ab77-424b-a1e0-43882b85445a\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " Dec 01 00:40:07 crc kubenswrapper[4911]: I1201 00:40:07.999238 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-entrypoint-script\") pod \"637dc0c7-ab77-424b-a1e0-43882b85445a\" (UID: \"637dc0c7-ab77-424b-a1e0-43882b85445a\") " Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.006364 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637dc0c7-ab77-424b-a1e0-43882b85445a-kube-api-access-jxg7m" (OuterVolumeSpecName: "kube-api-access-jxg7m") pod "637dc0c7-ab77-424b-a1e0-43882b85445a" (UID: "637dc0c7-ab77-424b-a1e0-43882b85445a"). InnerVolumeSpecName "kube-api-access-jxg7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.022605 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "637dc0c7-ab77-424b-a1e0-43882b85445a" (UID: "637dc0c7-ab77-424b-a1e0-43882b85445a"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.022826 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "637dc0c7-ab77-424b-a1e0-43882b85445a" (UID: "637dc0c7-ab77-424b-a1e0-43882b85445a"). 
InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.022871 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "637dc0c7-ab77-424b-a1e0-43882b85445a" (UID: "637dc0c7-ab77-424b-a1e0-43882b85445a"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.027259 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "637dc0c7-ab77-424b-a1e0-43882b85445a" (UID: "637dc0c7-ab77-424b-a1e0-43882b85445a"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.027652 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "637dc0c7-ab77-424b-a1e0-43882b85445a" (UID: "637dc0c7-ab77-424b-a1e0-43882b85445a"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.032419 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "637dc0c7-ab77-424b-a1e0-43882b85445a" (UID: "637dc0c7-ab77-424b-a1e0-43882b85445a"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.100732 4911 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-sensubility-config\") on node \"crc\" DevicePath \"\""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.100768 4911 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-healthcheck-log\") on node \"crc\" DevicePath \"\""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.100778 4911 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-config\") on node \"crc\" DevicePath \"\""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.100788 4911 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-publisher\") on node \"crc\" DevicePath \"\""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.100799 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxg7m\" (UniqueName: \"kubernetes.io/projected/637dc0c7-ab77-424b-a1e0-43882b85445a-kube-api-access-jxg7m\") on node \"crc\" DevicePath \"\""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.100807 4911 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.100816 4911 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/637dc0c7-ab77-424b-a1e0-43882b85445a-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\""
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.740213 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-59drf"
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.741161 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-59drf" event={"ID":"637dc0c7-ab77-424b-a1e0-43882b85445a","Type":"ContainerDied","Data":"c61d271e9ccf40addbb214577d17c961769f7c3b9904eab739173f8d14fa949b"}
Dec 01 00:40:08 crc kubenswrapper[4911]: I1201 00:40:08.741199 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c61d271e9ccf40addbb214577d17c961769f7c3b9904eab739173f8d14fa949b"
Dec 01 00:40:10 crc kubenswrapper[4911]: I1201 00:40:10.042452 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-59drf_637dc0c7-ab77-424b-a1e0-43882b85445a/smoketest-collectd/0.log"
Dec 01 00:40:10 crc kubenswrapper[4911]: I1201 00:40:10.390352 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-59drf_637dc0c7-ab77-424b-a1e0-43882b85445a/smoketest-ceilometer/0.log"
Dec 01 00:40:10 crc kubenswrapper[4911]: I1201 00:40:10.678958 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-kk5bf_309c2661-dd65-4895-99f6-8e35d708d3ec/default-interconnect/0.log"
Dec 01 00:40:10 crc kubenswrapper[4911]: I1201 00:40:10.987274 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p_91ab9c57-2dae-40c0-84b1-aca2e491c08f/bridge/2.log"
Dec 01 00:40:11 crc kubenswrapper[4911]: I1201 00:40:11.299393 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-26t2p_91ab9c57-2dae-40c0-84b1-aca2e491c08f/sg-core/0.log"
Dec 01 00:40:11 crc kubenswrapper[4911]: I1201 00:40:11.641825 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d_6c851284-97df-4a74-b9b3-991939ce2ebb/bridge/2.log"
Dec 01 00:40:11 crc kubenswrapper[4911]: I1201 00:40:11.975119 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7b58d8959d-qq56d_6c851284-97df-4a74-b9b3-991939ce2ebb/sg-core/0.log"
Dec 01 00:40:12 crc kubenswrapper[4911]: I1201 00:40:12.251103 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-5d526_d3409052-ad72-4833-a4f0-a0988b4f4814/bridge/2.log"
Dec 01 00:40:12 crc kubenswrapper[4911]: I1201 00:40:12.609921 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-5d526_d3409052-ad72-4833-a4f0-a0988b4f4814/sg-core/0.log"
Dec 01 00:40:12 crc kubenswrapper[4911]: I1201 00:40:12.899729 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-695c8d666-n897f_e6639df0-b3ed-44a8-84b7-8d7b5fd66df6/bridge/2.log"
Dec 01 00:40:13 crc kubenswrapper[4911]: I1201 00:40:13.142046 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-695c8d666-n897f_e6639df0-b3ed-44a8-84b7-8d7b5fd66df6/sg-core/0.log"
Dec 01 00:40:13 crc kubenswrapper[4911]: I1201 00:40:13.407639 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p_1d6b0cfd-fa42-480c-a9a0-d3425952b519/bridge/2.log"
Dec 01 00:40:13 crc kubenswrapper[4911]: I1201 00:40:13.674595 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-n8b4p_1d6b0cfd-fa42-480c-a9a0-d3425952b519/sg-core/0.log"
Dec 01 00:40:16 crc kubenswrapper[4911]: I1201 00:40:16.593129 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-6c444f9775-5nwgx_f172be71-8395-4d1e-a2ce-8692cde84dd1/operator/0.log"
Dec 01 00:40:16 crc kubenswrapper[4911]: I1201 00:40:16.865623 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_3a840446-75d9-4c19-816b-1e0424759234/prometheus/0.log"
Dec 01 00:40:17 crc kubenswrapper[4911]: I1201 00:40:17.164290 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_97613b36-7079-4bac-afc8-0c933bcb2d4d/elasticsearch/0.log"
Dec 01 00:40:17 crc kubenswrapper[4911]: I1201 00:40:17.489082 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-lmsf6_7c4346a2-51ae-4df1-b881-d6eac247e811/prometheus-webhook-snmp/0.log"
Dec 01 00:40:17 crc kubenswrapper[4911]: I1201 00:40:17.789066 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_9a5d7340-9706-4c29-95e4-eb116c175acc/alertmanager/0.log"
Dec 01 00:40:34 crc kubenswrapper[4911]: I1201 00:40:34.670300 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5d44b4d989-2ktqj_7514cd3a-f88e-48ee-9c94-7e6cd903201e/operator/0.log"
Dec 01 00:40:37 crc kubenswrapper[4911]: I1201 00:40:37.839821 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-6c444f9775-5nwgx_f172be71-8395-4d1e-a2ce-8692cde84dd1/operator/0.log"
Dec 01 00:40:38 crc kubenswrapper[4911]: I1201 00:40:38.101238 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_099f5cb4-cb7e-46cb-94cc-c2c5a2680a0e/qdr/0.log"
Dec 01 00:40:52 crc kubenswrapper[4911]: I1201 00:40:52.543271 4911 scope.go:117] "RemoveContainer" containerID="b1a7af46c8c53178acabb269bd37c81e0ab7744a6405b22a0f744963654d844d"
Dec 01 00:40:52 crc kubenswrapper[4911]: I1201 00:40:52.582953 4911 scope.go:117] "RemoveContainer" containerID="f006f4209131e0c4eaf3adce194c71a64279499ef4eb11160ecdfa89147af76d"
Dec 01 00:40:52 crc kubenswrapper[4911]: I1201 00:40:52.618895 4911 scope.go:117] "RemoveContainer" containerID="7ac1d8ba38f4f724dc6f314b2509a8f178d12f93538cc091da7cb88c10845f41"
Dec 01 00:40:52 crc kubenswrapper[4911]: I1201 00:40:52.657666 4911 scope.go:117] "RemoveContainer" containerID="d8032d4133d583694274d940c88b8d4fc82b447d0f789cedb3ba9c845a865519"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.347521 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2dqjc/must-gather-xqx6r"]
Dec 01 00:41:03 crc kubenswrapper[4911]: E1201 00:41:03.348347 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90214ed1-19ba-4245-9b92-5c3362d6bc4a" containerName="curl"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.348368 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="90214ed1-19ba-4245-9b92-5c3362d6bc4a" containerName="curl"
Dec 01 00:41:03 crc kubenswrapper[4911]: E1201 00:41:03.348384 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637dc0c7-ab77-424b-a1e0-43882b85445a" containerName="smoketest-collectd"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.348393 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="637dc0c7-ab77-424b-a1e0-43882b85445a" containerName="smoketest-collectd"
Dec 01 00:41:03 crc kubenswrapper[4911]: E1201 00:41:03.348403 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637dc0c7-ab77-424b-a1e0-43882b85445a" containerName="smoketest-ceilometer"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.348409 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="637dc0c7-ab77-424b-a1e0-43882b85445a" containerName="smoketest-ceilometer"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.348566 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="637dc0c7-ab77-424b-a1e0-43882b85445a" containerName="smoketest-ceilometer"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.348586 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="637dc0c7-ab77-424b-a1e0-43882b85445a" containerName="smoketest-collectd"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.348596 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="90214ed1-19ba-4245-9b92-5c3362d6bc4a" containerName="curl"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.349284 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.351002 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2dqjc"/"openshift-service-ca.crt"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.351405 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2dqjc"/"kube-root-ca.crt"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.353354 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2dqjc"/"default-dockercfg-srkpb"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.358747 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2dqjc/must-gather-xqx6r"]
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.548015 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8bmg\" (UniqueName: \"kubernetes.io/projected/284579be-6131-442c-b03e-95e9ed71262a-kube-api-access-f8bmg\") pod \"must-gather-xqx6r\" (UID: \"284579be-6131-442c-b03e-95e9ed71262a\") " pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.548439 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/284579be-6131-442c-b03e-95e9ed71262a-must-gather-output\") pod \"must-gather-xqx6r\" (UID: \"284579be-6131-442c-b03e-95e9ed71262a\") " pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.649807 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8bmg\" (UniqueName: \"kubernetes.io/projected/284579be-6131-442c-b03e-95e9ed71262a-kube-api-access-f8bmg\") pod \"must-gather-xqx6r\" (UID: \"284579be-6131-442c-b03e-95e9ed71262a\") " pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.649871 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/284579be-6131-442c-b03e-95e9ed71262a-must-gather-output\") pod \"must-gather-xqx6r\" (UID: \"284579be-6131-442c-b03e-95e9ed71262a\") " pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.650394 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/284579be-6131-442c-b03e-95e9ed71262a-must-gather-output\") pod \"must-gather-xqx6r\" (UID: \"284579be-6131-442c-b03e-95e9ed71262a\") " pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.670977 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8bmg\" (UniqueName: \"kubernetes.io/projected/284579be-6131-442c-b03e-95e9ed71262a-kube-api-access-f8bmg\") pod \"must-gather-xqx6r\" (UID: \"284579be-6131-442c-b03e-95e9ed71262a\") " pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:41:03 crc kubenswrapper[4911]: I1201 00:41:03.965422 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:41:04 crc kubenswrapper[4911]: I1201 00:41:04.201917 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2dqjc/must-gather-xqx6r"]
Dec 01 00:41:04 crc kubenswrapper[4911]: W1201 00:41:04.211084 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod284579be_6131_442c_b03e_95e9ed71262a.slice/crio-6c37e4a5e3ea36b740261f194ce331fb18a03ca5dcab57002454f96ecaef4b59 WatchSource:0}: Error finding container 6c37e4a5e3ea36b740261f194ce331fb18a03ca5dcab57002454f96ecaef4b59: Status 404 returned error can't find the container with id 6c37e4a5e3ea36b740261f194ce331fb18a03ca5dcab57002454f96ecaef4b59
Dec 01 00:41:05 crc kubenswrapper[4911]: I1201 00:41:05.219951 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2dqjc/must-gather-xqx6r" event={"ID":"284579be-6131-442c-b03e-95e9ed71262a","Type":"ContainerStarted","Data":"6c37e4a5e3ea36b740261f194ce331fb18a03ca5dcab57002454f96ecaef4b59"}
Dec 01 00:41:09 crc kubenswrapper[4911]: I1201 00:41:09.249660 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2dqjc/must-gather-xqx6r" event={"ID":"284579be-6131-442c-b03e-95e9ed71262a","Type":"ContainerStarted","Data":"35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44"}
Dec 01 00:41:09 crc kubenswrapper[4911]: I1201 00:41:09.251249 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2dqjc/must-gather-xqx6r" event={"ID":"284579be-6131-442c-b03e-95e9ed71262a","Type":"ContainerStarted","Data":"7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8"}
Dec 01 00:41:09 crc kubenswrapper[4911]: I1201 00:41:09.266982 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2dqjc/must-gather-xqx6r" podStartSLOduration=1.97428241 podStartE2EDuration="6.266964614s" podCreationTimestamp="2025-12-01 00:41:03 +0000 UTC" firstStartedPulling="2025-12-01 00:41:04.215416006 +0000 UTC m=+2024.354112777" lastFinishedPulling="2025-12-01 00:41:08.50809821 +0000 UTC m=+2028.646794981" observedRunningTime="2025-12-01 00:41:09.264339829 +0000 UTC m=+2029.403036590" watchObservedRunningTime="2025-12-01 00:41:09.266964614 +0000 UTC m=+2029.405661395"
Dec 01 00:41:51 crc kubenswrapper[4911]: I1201 00:41:51.311739 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 00:41:51 crc kubenswrapper[4911]: I1201 00:41:51.312273 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 00:41:51 crc kubenswrapper[4911]: I1201 00:41:51.977627 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tv9lz_5f81c0bb-de86-40af-8412-ceb41bf9478e/control-plane-machine-set-operator/0.log"
Dec 01 00:41:52 crc kubenswrapper[4911]: I1201 00:41:52.125090 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4j8rl_f4878bb4-63b2-481b-8055-dc5d69809b39/kube-rbac-proxy/0.log"
Dec 01 00:41:52 crc kubenswrapper[4911]: I1201 00:41:52.130581 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4j8rl_f4878bb4-63b2-481b-8055-dc5d69809b39/machine-api-operator/0.log"
Dec 01 00:42:03 crc kubenswrapper[4911]: I1201 00:42:03.290721 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-f775q_5511c855-0f49-4d84-83da-32932f2e4b1a/cert-manager-controller/0.log"
Dec 01 00:42:03 crc kubenswrapper[4911]: I1201 00:42:03.385923 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-q54sn_0daadb0a-db56-459b-b756-0f57d9dc0529/cert-manager-cainjector/0.log"
Dec 01 00:42:03 crc kubenswrapper[4911]: I1201 00:42:03.452415 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-z25xd_bebc7141-b369-409e-8629-25e95690723b/cert-manager-webhook/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.123271 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77_7d8f6326-b67c-4682-a37b-a79fb151552c/util/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.231095 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77_7d8f6326-b67c-4682-a37b-a79fb151552c/util/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.285015 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77_7d8f6326-b67c-4682-a37b-a79fb151552c/pull/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.304844 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77_7d8f6326-b67c-4682-a37b-a79fb151552c/pull/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.619186 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77_7d8f6326-b67c-4682-a37b-a79fb151552c/pull/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.619604 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77_7d8f6326-b67c-4682-a37b-a79fb151552c/extract/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.671226 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931arrf77_7d8f6326-b67c-4682-a37b-a79fb151552c/util/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.773274 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg_a48e7868-4393-43db-afbe-4752ecf2c918/util/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.930640 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg_a48e7868-4393-43db-afbe-4752ecf2c918/pull/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.934503 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg_a48e7868-4393-43db-afbe-4752ecf2c918/pull/0.log"
Dec 01 00:42:19 crc kubenswrapper[4911]: I1201 00:42:19.953503 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg_a48e7868-4393-43db-afbe-4752ecf2c918/util/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.197365 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg_a48e7868-4393-43db-afbe-4752ecf2c918/pull/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.197654 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg_a48e7868-4393-43db-afbe-4752ecf2c918/extract/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.202193 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210ggspg_a48e7868-4393-43db-afbe-4752ecf2c918/util/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.381548 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws_25ac6dcc-fada-4401-91b8-a5d37711b1ec/util/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.524632 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws_25ac6dcc-fada-4401-91b8-a5d37711b1ec/util/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.528668 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws_25ac6dcc-fada-4401-91b8-a5d37711b1ec/pull/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.552263 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws_25ac6dcc-fada-4401-91b8-a5d37711b1ec/pull/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.691375 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws_25ac6dcc-fada-4401-91b8-a5d37711b1ec/util/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.739066 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws_25ac6dcc-fada-4401-91b8-a5d37711b1ec/pull/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.754408 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fpp7ws_25ac6dcc-fada-4401-91b8-a5d37711b1ec/extract/0.log"
Dec 01 00:42:20 crc kubenswrapper[4911]: I1201 00:42:20.871552 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq_e02a088d-f2be-4aaf-bca1-fa4858cea430/util/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.003404 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq_e02a088d-f2be-4aaf-bca1-fa4858cea430/pull/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.036840 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq_e02a088d-f2be-4aaf-bca1-fa4858cea430/util/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.037569 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq_e02a088d-f2be-4aaf-bca1-fa4858cea430/pull/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.306839 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq_e02a088d-f2be-4aaf-bca1-fa4858cea430/pull/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.309743 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq_e02a088d-f2be-4aaf-bca1-fa4858cea430/util/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.311070 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.311087 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5egf7zq_e02a088d-f2be-4aaf-bca1-fa4858cea430/extract/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.311116 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.485276 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxg8r_f2e3a5b9-bf28-4144-8cc7-9c66843eccb5/extract-utilities/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.635448 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxg8r_f2e3a5b9-bf28-4144-8cc7-9c66843eccb5/extract-utilities/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.661265 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxg8r_f2e3a5b9-bf28-4144-8cc7-9c66843eccb5/extract-content/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.663242 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxg8r_f2e3a5b9-bf28-4144-8cc7-9c66843eccb5/extract-content/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.880331 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxg8r_f2e3a5b9-bf28-4144-8cc7-9c66843eccb5/extract-utilities/0.log"
Dec 01 00:42:21 crc kubenswrapper[4911]: I1201 00:42:21.919942 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxg8r_f2e3a5b9-bf28-4144-8cc7-9c66843eccb5/extract-content/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.089632 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gzf88_7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d/extract-utilities/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.267198 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxg8r_f2e3a5b9-bf28-4144-8cc7-9c66843eccb5/registry-server/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.333066 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gzf88_7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d/extract-utilities/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.334007 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gzf88_7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d/extract-content/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.357825 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gzf88_7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d/extract-content/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.493715 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gzf88_7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d/extract-utilities/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.535143 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gzf88_7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d/extract-content/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.756123 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gsmv_6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a/extract-utilities/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.791955 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-v4cgp_6f286bcc-7bb8-4571-a057-4db77eee17a6/marketplace-operator/0.log"
Dec 01 00:42:22 crc kubenswrapper[4911]: I1201 00:42:22.874878 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gzf88_7cf4caa9-984a-4b1f-8fe8-23c79dd9df8d/registry-server/0.log"
Dec 01 00:42:23 crc kubenswrapper[4911]: I1201 00:42:23.002111 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gsmv_6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a/extract-utilities/0.log"
Dec 01 00:42:23 crc kubenswrapper[4911]: I1201 00:42:23.034122 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gsmv_6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a/extract-content/0.log"
Dec 01 00:42:23 crc kubenswrapper[4911]: I1201 00:42:23.038569 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gsmv_6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a/extract-content/0.log"
Dec 01 00:42:23 crc kubenswrapper[4911]: I1201 00:42:23.295853 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gsmv_6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a/extract-utilities/0.log"
Dec 01 00:42:23 crc kubenswrapper[4911]: I1201 00:42:23.415144 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gsmv_6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a/extract-content/0.log"
Dec 01 00:42:23 crc kubenswrapper[4911]: I1201 00:42:23.415959 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8gsmv_6ac2dde0-9bc9-4dec-ae5f-dcb44c02b24a/registry-server/0.log"
Dec 01 00:42:36 crc kubenswrapper[4911]: I1201 00:42:36.010669 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-twxt7_3dcb1345-86cc-4712-a549-5ec7b06343f3/prometheus-operator/0.log"
Dec 01 00:42:36 crc kubenswrapper[4911]: I1201 00:42:36.290268 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78b478d44c-gzwsh_6b5103e9-aa5a-402b-a755-a2f2be984479/prometheus-operator-admission-webhook/0.log"
Dec 01 00:42:36 crc kubenswrapper[4911]: I1201 00:42:36.303300 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78b478d44c-rvs92_ac116d8a-ec46-415a-b9bb-357493c28dda/prometheus-operator-admission-webhook/0.log"
Dec 01 00:42:36 crc kubenswrapper[4911]: I1201 00:42:36.517763 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-hmcx6_b29bd7a0-85c7-43bc-8bab-adcafae9d8dc/operator/0.log"
Dec 01 00:42:36 crc kubenswrapper[4911]: I1201 00:42:36.517983 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-9sg5q_b2a8cc4e-d5e8-4825-a42b-9b8534030ff8/perses-operator/0.log"
Dec 01 00:42:51 crc kubenswrapper[4911]: I1201 00:42:51.311571 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 00:42:51 crc kubenswrapper[4911]: I1201 00:42:51.312074 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 00:42:51 crc kubenswrapper[4911]: I1201 00:42:51.312114 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9"
Dec 01 00:42:51 crc kubenswrapper[4911]: I1201 00:42:51.312689 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7095373e0714dc0a7569921cf9a51bfd5df924973bf8fa1f16cfab9c4c56c587"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 00:42:51 crc kubenswrapper[4911]: I1201 00:42:51.312735 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://7095373e0714dc0a7569921cf9a51bfd5df924973bf8fa1f16cfab9c4c56c587" gracePeriod=600
Dec 01 00:42:52 crc kubenswrapper[4911]: I1201 00:42:52.030677 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="7095373e0714dc0a7569921cf9a51bfd5df924973bf8fa1f16cfab9c4c56c587" exitCode=0
Dec 01 00:42:52 crc kubenswrapper[4911]: I1201 00:42:52.030742 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"7095373e0714dc0a7569921cf9a51bfd5df924973bf8fa1f16cfab9c4c56c587"}
Dec 01 00:42:52 crc kubenswrapper[4911]: I1201 00:42:52.031225 4911 scope.go:117] "RemoveContainer" containerID="644bb114d0c5a2ca4fe37d681268e8046e5a6f3f4735a3340b1538c13a8d2097"
Dec 01 00:42:53 crc kubenswrapper[4911]: I1201 00:42:53.045391 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerStarted","Data":"c07f648bd011b819f9a67d8e6d9d143c4e3dc0ff207216d43084a12e295c6069"}
Dec 01 00:43:25 crc kubenswrapper[4911]: I1201 00:43:25.402119 4911 generic.go:334] "Generic (PLEG): container finished" podID="284579be-6131-442c-b03e-95e9ed71262a" containerID="7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8" exitCode=0
Dec 01 00:43:25 crc kubenswrapper[4911]: I1201 00:43:25.402191 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2dqjc/must-gather-xqx6r" event={"ID":"284579be-6131-442c-b03e-95e9ed71262a","Type":"ContainerDied","Data":"7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8"}
Dec 01 00:43:25 crc kubenswrapper[4911]: I1201 00:43:25.403316 4911 scope.go:117] "RemoveContainer" containerID="7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8"
Dec 01 00:43:25 crc kubenswrapper[4911]: I1201 00:43:25.529769 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2dqjc_must-gather-xqx6r_284579be-6131-442c-b03e-95e9ed71262a/gather/0.log"
Dec 01 00:43:32 crc kubenswrapper[4911]: I1201 00:43:32.530082 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2dqjc/must-gather-xqx6r"]
Dec 01 00:43:32 crc kubenswrapper[4911]: I1201 00:43:32.530637 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2dqjc/must-gather-xqx6r" podUID="284579be-6131-442c-b03e-95e9ed71262a" containerName="copy" containerID="cri-o://35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44" gracePeriod=2
Dec 01 00:43:32 crc kubenswrapper[4911]: I1201 00:43:32.537512 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2dqjc/must-gather-xqx6r"]
Dec 01 00:43:32 crc kubenswrapper[4911]: I1201 00:43:32.990853 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2dqjc_must-gather-xqx6r_284579be-6131-442c-b03e-95e9ed71262a/copy/0.log"
Dec 01 00:43:32 crc kubenswrapper[4911]: I1201 00:43:32.991851 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2dqjc/must-gather-xqx6r"
Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.120957 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8bmg\" (UniqueName: \"kubernetes.io/projected/284579be-6131-442c-b03e-95e9ed71262a-kube-api-access-f8bmg\") pod \"284579be-6131-442c-b03e-95e9ed71262a\" (UID: \"284579be-6131-442c-b03e-95e9ed71262a\") "
Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.121174 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/284579be-6131-442c-b03e-95e9ed71262a-must-gather-output\") pod \"284579be-6131-442c-b03e-95e9ed71262a\" (UID: \"284579be-6131-442c-b03e-95e9ed71262a\") "
Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.134683 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284579be-6131-442c-b03e-95e9ed71262a-kube-api-access-f8bmg" (OuterVolumeSpecName: "kube-api-access-f8bmg") pod "284579be-6131-442c-b03e-95e9ed71262a" (UID: "284579be-6131-442c-b03e-95e9ed71262a"). InnerVolumeSpecName "kube-api-access-f8bmg".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.172989 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284579be-6131-442c-b03e-95e9ed71262a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "284579be-6131-442c-b03e-95e9ed71262a" (UID: "284579be-6131-442c-b03e-95e9ed71262a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.222884 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8bmg\" (UniqueName: \"kubernetes.io/projected/284579be-6131-442c-b03e-95e9ed71262a-kube-api-access-f8bmg\") on node \"crc\" DevicePath \"\"" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.223197 4911 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/284579be-6131-442c-b03e-95e9ed71262a-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.503077 4911 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2dqjc_must-gather-xqx6r_284579be-6131-442c-b03e-95e9ed71262a/copy/0.log" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.503478 4911 generic.go:334] "Generic (PLEG): container finished" podID="284579be-6131-442c-b03e-95e9ed71262a" containerID="35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44" exitCode=143 Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.503525 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2dqjc/must-gather-xqx6r" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.503532 4911 scope.go:117] "RemoveContainer" containerID="35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.525734 4911 scope.go:117] "RemoveContainer" containerID="7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.562751 4911 scope.go:117] "RemoveContainer" containerID="35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44" Dec 01 00:43:33 crc kubenswrapper[4911]: E1201 00:43:33.563176 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44\": container with ID starting with 35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44 not found: ID does not exist" containerID="35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.563215 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44"} err="failed to get container status \"35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44\": rpc error: code = NotFound desc = could not find container \"35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44\": container with ID starting with 35e0a46c99f7d32f7ec986b9e9d2668e2bad165fef37962cfd6d506f40af7a44 not found: ID does not exist" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.563237 4911 scope.go:117] "RemoveContainer" containerID="7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8" Dec 01 00:43:33 crc kubenswrapper[4911]: E1201 00:43:33.563710 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8\": container with ID starting with 7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8 not found: ID does not exist" containerID="7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8" Dec 01 00:43:33 crc kubenswrapper[4911]: I1201 00:43:33.563754 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8"} err="failed to get container status \"7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8\": rpc error: code = NotFound desc = could not find container \"7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8\": container with ID starting with 7d6df6da9de7b4a8753d3930a750bb0768991c55abfacf39979f5c40cc3198a8 not found: ID does not exist" Dec 01 00:43:34 crc kubenswrapper[4911]: I1201 00:43:34.161050 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="284579be-6131-442c-b03e-95e9ed71262a" path="/var/lib/kubelet/pods/284579be-6131-442c-b03e-95e9ed71262a/volumes" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.174400 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq"] Dec 01 00:45:00 crc kubenswrapper[4911]: E1201 00:45:00.175035 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284579be-6131-442c-b03e-95e9ed71262a" containerName="gather" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.175046 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="284579be-6131-442c-b03e-95e9ed71262a" containerName="gather" Dec 01 00:45:00 crc kubenswrapper[4911]: E1201 00:45:00.175087 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284579be-6131-442c-b03e-95e9ed71262a" containerName="copy" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.175094 4911 
state_mem.go:107] "Deleted CPUSet assignment" podUID="284579be-6131-442c-b03e-95e9ed71262a" containerName="copy" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.175209 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="284579be-6131-442c-b03e-95e9ed71262a" containerName="copy" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.175218 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="284579be-6131-442c-b03e-95e9ed71262a" containerName="gather" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.175753 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.180088 4911 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.182704 4911 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.187431 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq"] Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.322437 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a34dd1-5f54-4003-adac-6727e0325cca-config-volume\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.322872 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qlwc\" (UniqueName: 
\"kubernetes.io/projected/94a34dd1-5f54-4003-adac-6727e0325cca-kube-api-access-9qlwc\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.322966 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a34dd1-5f54-4003-adac-6727e0325cca-secret-volume\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.425203 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a34dd1-5f54-4003-adac-6727e0325cca-config-volume\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.425343 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qlwc\" (UniqueName: \"kubernetes.io/projected/94a34dd1-5f54-4003-adac-6727e0325cca-kube-api-access-9qlwc\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.425366 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a34dd1-5f54-4003-adac-6727e0325cca-secret-volume\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc 
kubenswrapper[4911]: I1201 00:45:00.426305 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a34dd1-5f54-4003-adac-6727e0325cca-config-volume\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.434902 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a34dd1-5f54-4003-adac-6727e0325cca-secret-volume\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.446295 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qlwc\" (UniqueName: \"kubernetes.io/projected/94a34dd1-5f54-4003-adac-6727e0325cca-kube-api-access-9qlwc\") pod \"collect-profiles-29409165-klncq\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.509836 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:00 crc kubenswrapper[4911]: W1201 00:45:00.975754 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a34dd1_5f54_4003_adac_6727e0325cca.slice/crio-17d80e5a945334ac5a0ab146c53c8d2be90c3913029903934e043470c39a3b0d WatchSource:0}: Error finding container 17d80e5a945334ac5a0ab146c53c8d2be90c3913029903934e043470c39a3b0d: Status 404 returned error can't find the container with id 17d80e5a945334ac5a0ab146c53c8d2be90c3913029903934e043470c39a3b0d Dec 01 00:45:00 crc kubenswrapper[4911]: I1201 00:45:00.976917 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq"] Dec 01 00:45:01 crc kubenswrapper[4911]: I1201 00:45:01.359663 4911 generic.go:334] "Generic (PLEG): container finished" podID="94a34dd1-5f54-4003-adac-6727e0325cca" containerID="53aac13a41bbfd157615960498d062fd35d9accb983463ee3cc974ff90d67008" exitCode=0 Dec 01 00:45:01 crc kubenswrapper[4911]: I1201 00:45:01.359722 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" event={"ID":"94a34dd1-5f54-4003-adac-6727e0325cca","Type":"ContainerDied","Data":"53aac13a41bbfd157615960498d062fd35d9accb983463ee3cc974ff90d67008"} Dec 01 00:45:01 crc kubenswrapper[4911]: I1201 00:45:01.359970 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" event={"ID":"94a34dd1-5f54-4003-adac-6727e0325cca","Type":"ContainerStarted","Data":"17d80e5a945334ac5a0ab146c53c8d2be90c3913029903934e043470c39a3b0d"} Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.645426 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.660518 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a34dd1-5f54-4003-adac-6727e0325cca-secret-volume\") pod \"94a34dd1-5f54-4003-adac-6727e0325cca\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.660600 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a34dd1-5f54-4003-adac-6727e0325cca-config-volume\") pod \"94a34dd1-5f54-4003-adac-6727e0325cca\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.660664 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qlwc\" (UniqueName: \"kubernetes.io/projected/94a34dd1-5f54-4003-adac-6727e0325cca-kube-api-access-9qlwc\") pod \"94a34dd1-5f54-4003-adac-6727e0325cca\" (UID: \"94a34dd1-5f54-4003-adac-6727e0325cca\") " Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.662028 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a34dd1-5f54-4003-adac-6727e0325cca-config-volume" (OuterVolumeSpecName: "config-volume") pod "94a34dd1-5f54-4003-adac-6727e0325cca" (UID: "94a34dd1-5f54-4003-adac-6727e0325cca"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.667657 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a34dd1-5f54-4003-adac-6727e0325cca-kube-api-access-9qlwc" (OuterVolumeSpecName: "kube-api-access-9qlwc") pod "94a34dd1-5f54-4003-adac-6727e0325cca" (UID: "94a34dd1-5f54-4003-adac-6727e0325cca"). 
InnerVolumeSpecName "kube-api-access-9qlwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.667725 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a34dd1-5f54-4003-adac-6727e0325cca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "94a34dd1-5f54-4003-adac-6727e0325cca" (UID: "94a34dd1-5f54-4003-adac-6727e0325cca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.762270 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qlwc\" (UniqueName: \"kubernetes.io/projected/94a34dd1-5f54-4003-adac-6727e0325cca-kube-api-access-9qlwc\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.762306 4911 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a34dd1-5f54-4003-adac-6727e0325cca-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:02 crc kubenswrapper[4911]: I1201 00:45:02.762319 4911 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a34dd1-5f54-4003-adac-6727e0325cca-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:03 crc kubenswrapper[4911]: I1201 00:45:03.378400 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" event={"ID":"94a34dd1-5f54-4003-adac-6727e0325cca","Type":"ContainerDied","Data":"17d80e5a945334ac5a0ab146c53c8d2be90c3913029903934e043470c39a3b0d"} Dec 01 00:45:03 crc kubenswrapper[4911]: I1201 00:45:03.378448 4911 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17d80e5a945334ac5a0ab146c53c8d2be90c3913029903934e043470c39a3b0d" Dec 01 00:45:03 crc kubenswrapper[4911]: I1201 00:45:03.378500 4911 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409165-klncq" Dec 01 00:45:03 crc kubenswrapper[4911]: I1201 00:45:03.755291 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m"] Dec 01 00:45:03 crc kubenswrapper[4911]: I1201 00:45:03.763749 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409120-zht4m"] Dec 01 00:45:04 crc kubenswrapper[4911]: I1201 00:45:04.168738 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd65335-e065-4572-9e3d-912fe012056b" path="/var/lib/kubelet/pods/3dd65335-e065-4572-9e3d-912fe012056b/volumes" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.516394 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-scx5b"] Dec 01 00:45:05 crc kubenswrapper[4911]: E1201 00:45:05.517045 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a34dd1-5f54-4003-adac-6727e0325cca" containerName="collect-profiles" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.517062 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a34dd1-5f54-4003-adac-6727e0325cca" containerName="collect-profiles" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.517270 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a34dd1-5f54-4003-adac-6727e0325cca" containerName="collect-profiles" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.518451 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.526370 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scx5b"] Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.705682 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vs5cb"] Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.707311 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.710306 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-catalog-content\") pod \"redhat-operators-scx5b\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.710351 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgqmv\" (UniqueName: \"kubernetes.io/projected/968ade4a-219b-4b9a-966a-1db41435c533-kube-api-access-xgqmv\") pod \"redhat-operators-scx5b\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.710384 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-utilities\") pod \"redhat-operators-scx5b\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.732605 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-vs5cb"] Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.811544 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-utilities\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.811646 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qd22\" (UniqueName: \"kubernetes.io/projected/8331c199-8cdd-442d-a5eb-7400c386cdb5-kube-api-access-8qd22\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.811689 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-catalog-content\") pod \"redhat-operators-scx5b\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.811716 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-catalog-content\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.811755 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgqmv\" (UniqueName: \"kubernetes.io/projected/968ade4a-219b-4b9a-966a-1db41435c533-kube-api-access-xgqmv\") pod \"redhat-operators-scx5b\" 
(UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.811785 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-utilities\") pod \"redhat-operators-scx5b\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.812270 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-catalog-content\") pod \"redhat-operators-scx5b\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.812384 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-utilities\") pod \"redhat-operators-scx5b\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.833409 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgqmv\" (UniqueName: \"kubernetes.io/projected/968ade4a-219b-4b9a-966a-1db41435c533-kube-api-access-xgqmv\") pod \"redhat-operators-scx5b\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.849001 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.913378 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qd22\" (UniqueName: \"kubernetes.io/projected/8331c199-8cdd-442d-a5eb-7400c386cdb5-kube-api-access-8qd22\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.913726 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-catalog-content\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.913792 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-utilities\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.914388 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-utilities\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.914490 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-catalog-content\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " 
pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:05 crc kubenswrapper[4911]: I1201 00:45:05.929581 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qd22\" (UniqueName: \"kubernetes.io/projected/8331c199-8cdd-442d-a5eb-7400c386cdb5-kube-api-access-8qd22\") pod \"certified-operators-vs5cb\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:06 crc kubenswrapper[4911]: I1201 00:45:06.025829 4911 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:06 crc kubenswrapper[4911]: I1201 00:45:06.295304 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scx5b"] Dec 01 00:45:06 crc kubenswrapper[4911]: I1201 00:45:06.322326 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vs5cb"] Dec 01 00:45:06 crc kubenswrapper[4911]: W1201 00:45:06.351439 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8331c199_8cdd_442d_a5eb_7400c386cdb5.slice/crio-6f342698d3dcf75c33fb893ca8b005fa55989d9b605352dd06a8e25995a16f70 WatchSource:0}: Error finding container 6f342698d3dcf75c33fb893ca8b005fa55989d9b605352dd06a8e25995a16f70: Status 404 returned error can't find the container with id 6f342698d3dcf75c33fb893ca8b005fa55989d9b605352dd06a8e25995a16f70 Dec 01 00:45:06 crc kubenswrapper[4911]: I1201 00:45:06.407937 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vs5cb" event={"ID":"8331c199-8cdd-442d-a5eb-7400c386cdb5","Type":"ContainerStarted","Data":"6f342698d3dcf75c33fb893ca8b005fa55989d9b605352dd06a8e25995a16f70"} Dec 01 00:45:06 crc kubenswrapper[4911]: I1201 00:45:06.417972 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-scx5b" event={"ID":"968ade4a-219b-4b9a-966a-1db41435c533","Type":"ContainerStarted","Data":"d92de6e0334daf55cf4eb5b17bee67f134ef84bda263d06a1414cc3c732fca69"} Dec 01 00:45:07 crc kubenswrapper[4911]: I1201 00:45:07.427698 4911 generic.go:334] "Generic (PLEG): container finished" podID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerID="bd0f7021b5ea0b50aad80f6ff41908e72bf0ad687da6c170520f65b0d1780c2f" exitCode=0 Dec 01 00:45:07 crc kubenswrapper[4911]: I1201 00:45:07.427871 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vs5cb" event={"ID":"8331c199-8cdd-442d-a5eb-7400c386cdb5","Type":"ContainerDied","Data":"bd0f7021b5ea0b50aad80f6ff41908e72bf0ad687da6c170520f65b0d1780c2f"} Dec 01 00:45:07 crc kubenswrapper[4911]: I1201 00:45:07.430509 4911 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 00:45:07 crc kubenswrapper[4911]: I1201 00:45:07.431047 4911 generic.go:334] "Generic (PLEG): container finished" podID="968ade4a-219b-4b9a-966a-1db41435c533" containerID="bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d" exitCode=0 Dec 01 00:45:07 crc kubenswrapper[4911]: I1201 00:45:07.431096 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scx5b" event={"ID":"968ade4a-219b-4b9a-966a-1db41435c533","Type":"ContainerDied","Data":"bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d"} Dec 01 00:45:07 crc kubenswrapper[4911]: I1201 00:45:07.925797 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-xbv4m"] Dec 01 00:45:07 crc kubenswrapper[4911]: I1201 00:45:07.927310 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:08 crc kubenswrapper[4911]: I1201 00:45:07.937519 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-xbv4m"] Dec 01 00:45:08 crc kubenswrapper[4911]: I1201 00:45:07.942364 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjxxh\" (UniqueName: \"kubernetes.io/projected/11a90d3d-9060-4c92-a342-4526d3eaa801-kube-api-access-wjxxh\") pod \"service-telemetry-framework-operators-xbv4m\" (UID: \"11a90d3d-9060-4c92-a342-4526d3eaa801\") " pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:08 crc kubenswrapper[4911]: I1201 00:45:08.043728 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjxxh\" (UniqueName: \"kubernetes.io/projected/11a90d3d-9060-4c92-a342-4526d3eaa801-kube-api-access-wjxxh\") pod \"service-telemetry-framework-operators-xbv4m\" (UID: \"11a90d3d-9060-4c92-a342-4526d3eaa801\") " pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:08 crc kubenswrapper[4911]: I1201 00:45:08.063337 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjxxh\" (UniqueName: \"kubernetes.io/projected/11a90d3d-9060-4c92-a342-4526d3eaa801-kube-api-access-wjxxh\") pod \"service-telemetry-framework-operators-xbv4m\" (UID: \"11a90d3d-9060-4c92-a342-4526d3eaa801\") " pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:08 crc kubenswrapper[4911]: I1201 00:45:08.291739 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:08 crc kubenswrapper[4911]: I1201 00:45:08.920539 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-xbv4m"] Dec 01 00:45:08 crc kubenswrapper[4911]: W1201 00:45:08.930331 4911 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a90d3d_9060_4c92_a342_4526d3eaa801.slice/crio-50ce1e711d0d2a006e9521daed95cf8a2d07cdb9004be887e3e3e20e871dc6d0 WatchSource:0}: Error finding container 50ce1e711d0d2a006e9521daed95cf8a2d07cdb9004be887e3e3e20e871dc6d0: Status 404 returned error can't find the container with id 50ce1e711d0d2a006e9521daed95cf8a2d07cdb9004be887e3e3e20e871dc6d0 Dec 01 00:45:09 crc kubenswrapper[4911]: I1201 00:45:09.447137 4911 generic.go:334] "Generic (PLEG): container finished" podID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerID="1cf7a146320a859ee31c7d6c0e5686b0619c849a45eaf55981028fb76ffddde4" exitCode=0 Dec 01 00:45:09 crc kubenswrapper[4911]: I1201 00:45:09.447213 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vs5cb" event={"ID":"8331c199-8cdd-442d-a5eb-7400c386cdb5","Type":"ContainerDied","Data":"1cf7a146320a859ee31c7d6c0e5686b0619c849a45eaf55981028fb76ffddde4"} Dec 01 00:45:09 crc kubenswrapper[4911]: I1201 00:45:09.451679 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" event={"ID":"11a90d3d-9060-4c92-a342-4526d3eaa801","Type":"ContainerStarted","Data":"6314beec34759e39327d733205e40f7669a8f876030c8388a6ddf7ac22758f27"} Dec 01 00:45:09 crc kubenswrapper[4911]: I1201 00:45:09.451720 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" 
event={"ID":"11a90d3d-9060-4c92-a342-4526d3eaa801","Type":"ContainerStarted","Data":"50ce1e711d0d2a006e9521daed95cf8a2d07cdb9004be887e3e3e20e871dc6d0"} Dec 01 00:45:09 crc kubenswrapper[4911]: I1201 00:45:09.455390 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scx5b" event={"ID":"968ade4a-219b-4b9a-966a-1db41435c533","Type":"ContainerStarted","Data":"e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51"} Dec 01 00:45:09 crc kubenswrapper[4911]: I1201 00:45:09.517815 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" podStartSLOduration=2.41981484 podStartE2EDuration="2.517790974s" podCreationTimestamp="2025-12-01 00:45:07 +0000 UTC" firstStartedPulling="2025-12-01 00:45:08.93536497 +0000 UTC m=+2269.074061741" lastFinishedPulling="2025-12-01 00:45:09.033341104 +0000 UTC m=+2269.172037875" observedRunningTime="2025-12-01 00:45:09.511146595 +0000 UTC m=+2269.649843366" watchObservedRunningTime="2025-12-01 00:45:09.517790974 +0000 UTC m=+2269.656487775" Dec 01 00:45:10 crc kubenswrapper[4911]: I1201 00:45:10.465735 4911 generic.go:334] "Generic (PLEG): container finished" podID="968ade4a-219b-4b9a-966a-1db41435c533" containerID="e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51" exitCode=0 Dec 01 00:45:10 crc kubenswrapper[4911]: I1201 00:45:10.467179 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scx5b" event={"ID":"968ade4a-219b-4b9a-966a-1db41435c533","Type":"ContainerDied","Data":"e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51"} Dec 01 00:45:11 crc kubenswrapper[4911]: I1201 00:45:11.475291 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scx5b" 
event={"ID":"968ade4a-219b-4b9a-966a-1db41435c533","Type":"ContainerStarted","Data":"8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3"} Dec 01 00:45:11 crc kubenswrapper[4911]: I1201 00:45:11.478589 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vs5cb" event={"ID":"8331c199-8cdd-442d-a5eb-7400c386cdb5","Type":"ContainerStarted","Data":"bc1fe6a9277aef32fbd3ff4d8f36452edafd854980a7b6a82f74541aec479e0f"} Dec 01 00:45:11 crc kubenswrapper[4911]: I1201 00:45:11.495423 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-scx5b" podStartSLOduration=2.889482852 podStartE2EDuration="6.495405583s" podCreationTimestamp="2025-12-01 00:45:05 +0000 UTC" firstStartedPulling="2025-12-01 00:45:07.433307847 +0000 UTC m=+2267.572004618" lastFinishedPulling="2025-12-01 00:45:11.039230578 +0000 UTC m=+2271.177927349" observedRunningTime="2025-12-01 00:45:11.491905933 +0000 UTC m=+2271.630602734" watchObservedRunningTime="2025-12-01 00:45:11.495405583 +0000 UTC m=+2271.634102344" Dec 01 00:45:11 crc kubenswrapper[4911]: I1201 00:45:11.517315 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vs5cb" podStartSLOduration=3.571186597 podStartE2EDuration="6.517300567s" podCreationTimestamp="2025-12-01 00:45:05 +0000 UTC" firstStartedPulling="2025-12-01 00:45:07.430185879 +0000 UTC m=+2267.568882650" lastFinishedPulling="2025-12-01 00:45:10.376299859 +0000 UTC m=+2270.514996620" observedRunningTime="2025-12-01 00:45:11.514658802 +0000 UTC m=+2271.653355573" watchObservedRunningTime="2025-12-01 00:45:11.517300567 +0000 UTC m=+2271.655997338" Dec 01 00:45:15 crc kubenswrapper[4911]: I1201 00:45:15.850137 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:15 crc kubenswrapper[4911]: I1201 00:45:15.850512 4911 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:16 crc kubenswrapper[4911]: I1201 00:45:16.026254 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:16 crc kubenswrapper[4911]: I1201 00:45:16.026333 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:16 crc kubenswrapper[4911]: I1201 00:45:16.070591 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:16 crc kubenswrapper[4911]: I1201 00:45:16.632859 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:16 crc kubenswrapper[4911]: I1201 00:45:16.907487 4911 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-scx5b" podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="registry-server" probeResult="failure" output=< Dec 01 00:45:16 crc kubenswrapper[4911]: timeout: failed to connect service ":50051" within 1s Dec 01 00:45:16 crc kubenswrapper[4911]: > Dec 01 00:45:18 crc kubenswrapper[4911]: I1201 00:45:18.292210 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:18 crc kubenswrapper[4911]: I1201 00:45:18.292278 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:18 crc kubenswrapper[4911]: I1201 00:45:18.327713 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:18 crc kubenswrapper[4911]: I1201 00:45:18.621066 4911 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:20 crc kubenswrapper[4911]: I1201 00:45:20.497621 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vs5cb"] Dec 01 00:45:20 crc kubenswrapper[4911]: I1201 00:45:20.497839 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vs5cb" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerName="registry-server" containerID="cri-o://bc1fe6a9277aef32fbd3ff4d8f36452edafd854980a7b6a82f74541aec479e0f" gracePeriod=2 Dec 01 00:45:21 crc kubenswrapper[4911]: I1201 00:45:21.312014 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:45:21 crc kubenswrapper[4911]: I1201 00:45:21.312369 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.300699 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-xbv4m"] Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.300964 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" podUID="11a90d3d-9060-4c92-a342-4526d3eaa801" containerName="registry-server" containerID="cri-o://6314beec34759e39327d733205e40f7669a8f876030c8388a6ddf7ac22758f27" gracePeriod=2 Dec 01 00:45:22 
crc kubenswrapper[4911]: I1201 00:45:22.669878 4911 generic.go:334] "Generic (PLEG): container finished" podID="11a90d3d-9060-4c92-a342-4526d3eaa801" containerID="6314beec34759e39327d733205e40f7669a8f876030c8388a6ddf7ac22758f27" exitCode=0 Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.669944 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" event={"ID":"11a90d3d-9060-4c92-a342-4526d3eaa801","Type":"ContainerDied","Data":"6314beec34759e39327d733205e40f7669a8f876030c8388a6ddf7ac22758f27"} Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.673411 4911 generic.go:334] "Generic (PLEG): container finished" podID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerID="bc1fe6a9277aef32fbd3ff4d8f36452edafd854980a7b6a82f74541aec479e0f" exitCode=0 Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.673445 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vs5cb" event={"ID":"8331c199-8cdd-442d-a5eb-7400c386cdb5","Type":"ContainerDied","Data":"bc1fe6a9277aef32fbd3ff4d8f36452edafd854980a7b6a82f74541aec479e0f"} Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.727145 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.794494 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.921450 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjxxh\" (UniqueName: \"kubernetes.io/projected/11a90d3d-9060-4c92-a342-4526d3eaa801-kube-api-access-wjxxh\") pod \"11a90d3d-9060-4c92-a342-4526d3eaa801\" (UID: \"11a90d3d-9060-4c92-a342-4526d3eaa801\") " Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.921534 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-catalog-content\") pod \"8331c199-8cdd-442d-a5eb-7400c386cdb5\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.921598 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-utilities\") pod \"8331c199-8cdd-442d-a5eb-7400c386cdb5\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.921667 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qd22\" (UniqueName: \"kubernetes.io/projected/8331c199-8cdd-442d-a5eb-7400c386cdb5-kube-api-access-8qd22\") pod \"8331c199-8cdd-442d-a5eb-7400c386cdb5\" (UID: \"8331c199-8cdd-442d-a5eb-7400c386cdb5\") " Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.922572 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-utilities" (OuterVolumeSpecName: "utilities") pod "8331c199-8cdd-442d-a5eb-7400c386cdb5" (UID: "8331c199-8cdd-442d-a5eb-7400c386cdb5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.926672 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a90d3d-9060-4c92-a342-4526d3eaa801-kube-api-access-wjxxh" (OuterVolumeSpecName: "kube-api-access-wjxxh") pod "11a90d3d-9060-4c92-a342-4526d3eaa801" (UID: "11a90d3d-9060-4c92-a342-4526d3eaa801"). InnerVolumeSpecName "kube-api-access-wjxxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.926673 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8331c199-8cdd-442d-a5eb-7400c386cdb5-kube-api-access-8qd22" (OuterVolumeSpecName: "kube-api-access-8qd22") pod "8331c199-8cdd-442d-a5eb-7400c386cdb5" (UID: "8331c199-8cdd-442d-a5eb-7400c386cdb5"). InnerVolumeSpecName "kube-api-access-8qd22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:45:22 crc kubenswrapper[4911]: I1201 00:45:22.970321 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8331c199-8cdd-442d-a5eb-7400c386cdb5" (UID: "8331c199-8cdd-442d-a5eb-7400c386cdb5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.023766 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qd22\" (UniqueName: \"kubernetes.io/projected/8331c199-8cdd-442d-a5eb-7400c386cdb5-kube-api-access-8qd22\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.024005 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjxxh\" (UniqueName: \"kubernetes.io/projected/11a90d3d-9060-4c92-a342-4526d3eaa801-kube-api-access-wjxxh\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.024126 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.024202 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8331c199-8cdd-442d-a5eb-7400c386cdb5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.683745 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" event={"ID":"11a90d3d-9060-4c92-a342-4526d3eaa801","Type":"ContainerDied","Data":"50ce1e711d0d2a006e9521daed95cf8a2d07cdb9004be887e3e3e20e871dc6d0"} Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.684088 4911 scope.go:117] "RemoveContainer" containerID="6314beec34759e39327d733205e40f7669a8f876030c8388a6ddf7ac22758f27" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.683796 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-xbv4m" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.687025 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vs5cb" event={"ID":"8331c199-8cdd-442d-a5eb-7400c386cdb5","Type":"ContainerDied","Data":"6f342698d3dcf75c33fb893ca8b005fa55989d9b605352dd06a8e25995a16f70"} Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.687083 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vs5cb" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.703273 4911 scope.go:117] "RemoveContainer" containerID="bc1fe6a9277aef32fbd3ff4d8f36452edafd854980a7b6a82f74541aec479e0f" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.726174 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vs5cb"] Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.729563 4911 scope.go:117] "RemoveContainer" containerID="1cf7a146320a859ee31c7d6c0e5686b0619c849a45eaf55981028fb76ffddde4" Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.734650 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vs5cb"] Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.740609 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-xbv4m"] Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.748525 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-xbv4m"] Dec 01 00:45:23 crc kubenswrapper[4911]: I1201 00:45:23.749317 4911 scope.go:117] "RemoveContainer" containerID="bd0f7021b5ea0b50aad80f6ff41908e72bf0ad687da6c170520f65b0d1780c2f" Dec 01 00:45:24 crc kubenswrapper[4911]: I1201 00:45:24.167943 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="11a90d3d-9060-4c92-a342-4526d3eaa801" path="/var/lib/kubelet/pods/11a90d3d-9060-4c92-a342-4526d3eaa801/volumes" Dec 01 00:45:24 crc kubenswrapper[4911]: I1201 00:45:24.169265 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" path="/var/lib/kubelet/pods/8331c199-8cdd-442d-a5eb-7400c386cdb5/volumes" Dec 01 00:45:25 crc kubenswrapper[4911]: I1201 00:45:25.904815 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:25 crc kubenswrapper[4911]: I1201 00:45:25.948885 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.103447 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scx5b"] Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.104253 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-scx5b" podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="registry-server" containerID="cri-o://8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3" gracePeriod=2 Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.548856 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.618851 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-utilities\") pod \"968ade4a-219b-4b9a-966a-1db41435c533\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.618905 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgqmv\" (UniqueName: \"kubernetes.io/projected/968ade4a-219b-4b9a-966a-1db41435c533-kube-api-access-xgqmv\") pod \"968ade4a-219b-4b9a-966a-1db41435c533\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.618980 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-catalog-content\") pod \"968ade4a-219b-4b9a-966a-1db41435c533\" (UID: \"968ade4a-219b-4b9a-966a-1db41435c533\") " Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.620391 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-utilities" (OuterVolumeSpecName: "utilities") pod "968ade4a-219b-4b9a-966a-1db41435c533" (UID: "968ade4a-219b-4b9a-966a-1db41435c533"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.625403 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/968ade4a-219b-4b9a-966a-1db41435c533-kube-api-access-xgqmv" (OuterVolumeSpecName: "kube-api-access-xgqmv") pod "968ade4a-219b-4b9a-966a-1db41435c533" (UID: "968ade4a-219b-4b9a-966a-1db41435c533"). InnerVolumeSpecName "kube-api-access-xgqmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.720111 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.720143 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgqmv\" (UniqueName: \"kubernetes.io/projected/968ade4a-219b-4b9a-966a-1db41435c533-kube-api-access-xgqmv\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.733718 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "968ade4a-219b-4b9a-966a-1db41435c533" (UID: "968ade4a-219b-4b9a-966a-1db41435c533"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.761619 4911 generic.go:334] "Generic (PLEG): container finished" podID="968ade4a-219b-4b9a-966a-1db41435c533" containerID="8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3" exitCode=0 Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.761669 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scx5b" event={"ID":"968ade4a-219b-4b9a-966a-1db41435c533","Type":"ContainerDied","Data":"8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3"} Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.761693 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scx5b" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.761713 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scx5b" event={"ID":"968ade4a-219b-4b9a-966a-1db41435c533","Type":"ContainerDied","Data":"d92de6e0334daf55cf4eb5b17bee67f134ef84bda263d06a1414cc3c732fca69"} Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.761791 4911 scope.go:117] "RemoveContainer" containerID="8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.783052 4911 scope.go:117] "RemoveContainer" containerID="e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.800917 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scx5b"] Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.806325 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-scx5b"] Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.818092 4911 scope.go:117] "RemoveContainer" containerID="bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.821681 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/968ade4a-219b-4b9a-966a-1db41435c533-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.837440 4911 scope.go:117] "RemoveContainer" containerID="8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3" Dec 01 00:45:28 crc kubenswrapper[4911]: E1201 00:45:28.837952 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3\": container with ID 
starting with 8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3 not found: ID does not exist" containerID="8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.838007 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3"} err="failed to get container status \"8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3\": rpc error: code = NotFound desc = could not find container \"8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3\": container with ID starting with 8c188512d46a673910b24fb6eaea6b16ba4376b345ecfcc92a65739da27652e3 not found: ID does not exist" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.838044 4911 scope.go:117] "RemoveContainer" containerID="e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51" Dec 01 00:45:28 crc kubenswrapper[4911]: E1201 00:45:28.838395 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51\": container with ID starting with e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51 not found: ID does not exist" containerID="e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.838431 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51"} err="failed to get container status \"e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51\": rpc error: code = NotFound desc = could not find container \"e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51\": container with ID starting with e0632429ac0d9c7582cdb68f722e9f782513fd0f0debb3e18eec9410eaadab51 not found: 
ID does not exist" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.838482 4911 scope.go:117] "RemoveContainer" containerID="bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d" Dec 01 00:45:28 crc kubenswrapper[4911]: E1201 00:45:28.838733 4911 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d\": container with ID starting with bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d not found: ID does not exist" containerID="bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d" Dec 01 00:45:28 crc kubenswrapper[4911]: I1201 00:45:28.838754 4911 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d"} err="failed to get container status \"bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d\": rpc error: code = NotFound desc = could not find container \"bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d\": container with ID starting with bed103d86302effa9d537cd21fc0d18a610ff1fa8bd85d20139d1ba60f5bfc7d not found: ID does not exist" Dec 01 00:45:30 crc kubenswrapper[4911]: I1201 00:45:30.168021 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="968ade4a-219b-4b9a-966a-1db41435c533" path="/var/lib/kubelet/pods/968ade4a-219b-4b9a-966a-1db41435c533/volumes" Dec 01 00:45:51 crc kubenswrapper[4911]: I1201 00:45:51.312022 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:45:51 crc kubenswrapper[4911]: I1201 00:45:51.312801 4911 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.396280 4911 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bxcnb"] Dec 01 00:45:52 crc kubenswrapper[4911]: E1201 00:45:52.396939 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerName="extract-utilities" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.396973 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerName="extract-utilities" Dec 01 00:45:52 crc kubenswrapper[4911]: E1201 00:45:52.396999 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a90d3d-9060-4c92-a342-4526d3eaa801" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397016 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a90d3d-9060-4c92-a342-4526d3eaa801" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: E1201 00:45:52.397054 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerName="extract-content" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397073 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerName="extract-content" Dec 01 00:45:52 crc kubenswrapper[4911]: E1201 00:45:52.397104 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="extract-utilities" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397121 4911 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="extract-utilities" Dec 01 00:45:52 crc kubenswrapper[4911]: E1201 00:45:52.397155 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397172 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: E1201 00:45:52.397197 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="extract-content" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397214 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="extract-content" Dec 01 00:45:52 crc kubenswrapper[4911]: E1201 00:45:52.397243 4911 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397260 4911 state_mem.go:107] "Deleted CPUSet assignment" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397573 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="968ade4a-219b-4b9a-966a-1db41435c533" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397621 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="8331c199-8cdd-442d-a5eb-7400c386cdb5" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.397656 4911 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a90d3d-9060-4c92-a342-4526d3eaa801" containerName="registry-server" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.399980 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.409113 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxcnb"] Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.440414 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkbm2\" (UniqueName: \"kubernetes.io/projected/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-kube-api-access-rkbm2\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.440509 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-utilities\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.440565 4911 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-catalog-content\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.541739 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkbm2\" (UniqueName: \"kubernetes.io/projected/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-kube-api-access-rkbm2\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.541817 4911 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-utilities\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.541842 4911 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-catalog-content\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.542358 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-utilities\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.542401 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-catalog-content\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.562348 4911 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkbm2\" (UniqueName: \"kubernetes.io/projected/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-kube-api-access-rkbm2\") pod \"community-operators-bxcnb\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.729561 4911 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.825349 4911 scope.go:117] "RemoveContainer" containerID="27aebb5f490bef73029556fb30a80f363cc8f09346152087a96aee6d51687f4d" Dec 01 00:45:52 crc kubenswrapper[4911]: I1201 00:45:52.998725 4911 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxcnb"] Dec 01 00:45:54 crc kubenswrapper[4911]: I1201 00:45:54.008949 4911 generic.go:334] "Generic (PLEG): container finished" podID="cfa914ae-d540-4eeb-ab49-365b6d1bb2d5" containerID="0f5ab8cdfd5fe8a01cdfecb6f8565a612c25964a4163ccdad765161fdacf9583" exitCode=0 Dec 01 00:45:54 crc kubenswrapper[4911]: I1201 00:45:54.009159 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxcnb" event={"ID":"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5","Type":"ContainerDied","Data":"0f5ab8cdfd5fe8a01cdfecb6f8565a612c25964a4163ccdad765161fdacf9583"} Dec 01 00:45:54 crc kubenswrapper[4911]: I1201 00:45:54.009266 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxcnb" event={"ID":"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5","Type":"ContainerStarted","Data":"c2ae1da12ead9c08a1ab48939dc546742609543ae5dd1eb0489cc01aa28d2563"} Dec 01 00:45:56 crc kubenswrapper[4911]: I1201 00:45:56.030339 4911 generic.go:334] "Generic (PLEG): container finished" podID="cfa914ae-d540-4eeb-ab49-365b6d1bb2d5" containerID="f2537fe33870cf2945363a03c6110cf151ceac2b999114edca1e074a64bb15b5" exitCode=0 Dec 01 00:45:56 crc kubenswrapper[4911]: I1201 00:45:56.030492 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxcnb" event={"ID":"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5","Type":"ContainerDied","Data":"f2537fe33870cf2945363a03c6110cf151ceac2b999114edca1e074a64bb15b5"} Dec 01 00:45:57 crc kubenswrapper[4911]: I1201 00:45:57.043034 4911 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxcnb" event={"ID":"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5","Type":"ContainerStarted","Data":"6c4742da9b9ecd75085ce2ff9f112f97b2d61610baae8a9c5a6cc5cec0a78a68"} Dec 01 00:45:57 crc kubenswrapper[4911]: I1201 00:45:57.072594 4911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bxcnb" podStartSLOduration=2.4366785650000002 podStartE2EDuration="5.072569809s" podCreationTimestamp="2025-12-01 00:45:52 +0000 UTC" firstStartedPulling="2025-12-01 00:45:54.011293788 +0000 UTC m=+2314.149990559" lastFinishedPulling="2025-12-01 00:45:56.647185002 +0000 UTC m=+2316.785881803" observedRunningTime="2025-12-01 00:45:57.060931017 +0000 UTC m=+2317.199627838" watchObservedRunningTime="2025-12-01 00:45:57.072569809 +0000 UTC m=+2317.211266580" Dec 01 00:46:02 crc kubenswrapper[4911]: I1201 00:46:02.730574 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:46:02 crc kubenswrapper[4911]: I1201 00:46:02.732005 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:46:02 crc kubenswrapper[4911]: I1201 00:46:02.769195 4911 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:46:03 crc kubenswrapper[4911]: I1201 00:46:03.171621 4911 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:46:03 crc kubenswrapper[4911]: I1201 00:46:03.241358 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxcnb"] Dec 01 00:46:05 crc kubenswrapper[4911]: I1201 00:46:05.114823 4911 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-bxcnb" podUID="cfa914ae-d540-4eeb-ab49-365b6d1bb2d5" containerName="registry-server" containerID="cri-o://6c4742da9b9ecd75085ce2ff9f112f97b2d61610baae8a9c5a6cc5cec0a78a68" gracePeriod=2 Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.131627 4911 generic.go:334] "Generic (PLEG): container finished" podID="cfa914ae-d540-4eeb-ab49-365b6d1bb2d5" containerID="6c4742da9b9ecd75085ce2ff9f112f97b2d61610baae8a9c5a6cc5cec0a78a68" exitCode=0 Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.131771 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxcnb" event={"ID":"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5","Type":"ContainerDied","Data":"6c4742da9b9ecd75085ce2ff9f112f97b2d61610baae8a9c5a6cc5cec0a78a68"} Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.660033 4911 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.678237 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-catalog-content\") pod \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.678319 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-utilities\") pod \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.678491 4911 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkbm2\" (UniqueName: \"kubernetes.io/projected/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-kube-api-access-rkbm2\") pod 
\"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\" (UID: \"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5\") " Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.680475 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-utilities" (OuterVolumeSpecName: "utilities") pod "cfa914ae-d540-4eeb-ab49-365b6d1bb2d5" (UID: "cfa914ae-d540-4eeb-ab49-365b6d1bb2d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.691831 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-kube-api-access-rkbm2" (OuterVolumeSpecName: "kube-api-access-rkbm2") pod "cfa914ae-d540-4eeb-ab49-365b6d1bb2d5" (UID: "cfa914ae-d540-4eeb-ab49-365b6d1bb2d5"). InnerVolumeSpecName "kube-api-access-rkbm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.747355 4911 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfa914ae-d540-4eeb-ab49-365b6d1bb2d5" (UID: "cfa914ae-d540-4eeb-ab49-365b6d1bb2d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.780697 4911 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkbm2\" (UniqueName: \"kubernetes.io/projected/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-kube-api-access-rkbm2\") on node \"crc\" DevicePath \"\"" Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.780726 4911 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 00:46:06 crc kubenswrapper[4911]: I1201 00:46:06.780737 4911 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 00:46:07 crc kubenswrapper[4911]: I1201 00:46:07.146670 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxcnb" event={"ID":"cfa914ae-d540-4eeb-ab49-365b6d1bb2d5","Type":"ContainerDied","Data":"c2ae1da12ead9c08a1ab48939dc546742609543ae5dd1eb0489cc01aa28d2563"} Dec 01 00:46:07 crc kubenswrapper[4911]: I1201 00:46:07.146725 4911 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bxcnb" Dec 01 00:46:07 crc kubenswrapper[4911]: I1201 00:46:07.146758 4911 scope.go:117] "RemoveContainer" containerID="6c4742da9b9ecd75085ce2ff9f112f97b2d61610baae8a9c5a6cc5cec0a78a68" Dec 01 00:46:07 crc kubenswrapper[4911]: I1201 00:46:07.171209 4911 scope.go:117] "RemoveContainer" containerID="f2537fe33870cf2945363a03c6110cf151ceac2b999114edca1e074a64bb15b5" Dec 01 00:46:07 crc kubenswrapper[4911]: I1201 00:46:07.259544 4911 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxcnb"] Dec 01 00:46:07 crc kubenswrapper[4911]: I1201 00:46:07.263932 4911 scope.go:117] "RemoveContainer" containerID="0f5ab8cdfd5fe8a01cdfecb6f8565a612c25964a4163ccdad765161fdacf9583" Dec 01 00:46:07 crc kubenswrapper[4911]: I1201 00:46:07.267366 4911 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bxcnb"] Dec 01 00:46:08 crc kubenswrapper[4911]: I1201 00:46:08.162312 4911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa914ae-d540-4eeb-ab49-365b6d1bb2d5" path="/var/lib/kubelet/pods/cfa914ae-d540-4eeb-ab49-365b6d1bb2d5/volumes" Dec 01 00:46:21 crc kubenswrapper[4911]: I1201 00:46:21.311771 4911 patch_prober.go:28] interesting pod/machine-config-daemon-cp4w9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 00:46:21 crc kubenswrapper[4911]: I1201 00:46:21.312498 4911 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 00:46:21 crc kubenswrapper[4911]: 
I1201 00:46:21.312572 4911 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" Dec 01 00:46:21 crc kubenswrapper[4911]: I1201 00:46:21.313586 4911 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c07f648bd011b819f9a67d8e6d9d143c4e3dc0ff207216d43084a12e295c6069"} pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 00:46:21 crc kubenswrapper[4911]: I1201 00:46:21.313691 4911 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" containerName="machine-config-daemon" containerID="cri-o://c07f648bd011b819f9a67d8e6d9d143c4e3dc0ff207216d43084a12e295c6069" gracePeriod=600 Dec 01 00:46:21 crc kubenswrapper[4911]: E1201 00:46:21.445378 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:46:22 crc kubenswrapper[4911]: I1201 00:46:22.285893 4911 generic.go:334] "Generic (PLEG): container finished" podID="470f170b-eeab-4f43-bd48-18e50771289a" containerID="c07f648bd011b819f9a67d8e6d9d143c4e3dc0ff207216d43084a12e295c6069" exitCode=0 Dec 01 00:46:22 crc kubenswrapper[4911]: I1201 00:46:22.285968 4911 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" 
event={"ID":"470f170b-eeab-4f43-bd48-18e50771289a","Type":"ContainerDied","Data":"c07f648bd011b819f9a67d8e6d9d143c4e3dc0ff207216d43084a12e295c6069"} Dec 01 00:46:22 crc kubenswrapper[4911]: I1201 00:46:22.286248 4911 scope.go:117] "RemoveContainer" containerID="7095373e0714dc0a7569921cf9a51bfd5df924973bf8fa1f16cfab9c4c56c587" Dec 01 00:46:22 crc kubenswrapper[4911]: I1201 00:46:22.286829 4911 scope.go:117] "RemoveContainer" containerID="c07f648bd011b819f9a67d8e6d9d143c4e3dc0ff207216d43084a12e295c6069" Dec 01 00:46:22 crc kubenswrapper[4911]: E1201 00:46:22.287076 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a" Dec 01 00:46:36 crc kubenswrapper[4911]: I1201 00:46:36.151351 4911 scope.go:117] "RemoveContainer" containerID="c07f648bd011b819f9a67d8e6d9d143c4e3dc0ff207216d43084a12e295c6069" Dec 01 00:46:36 crc kubenswrapper[4911]: E1201 00:46:36.152161 4911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp4w9_openshift-machine-config-operator(470f170b-eeab-4f43-bd48-18e50771289a)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp4w9" podUID="470f170b-eeab-4f43-bd48-18e50771289a"